When using Databricks Genie as your AI model provider, you can use AI models deployed through Databricks serving endpoints to power AI features in Omni.
Requirements
Databricks Genie requires you to create and configure your own serving endpoints in Databricks before configuring them in Omni. Unlike other providers, Databricks Genie has no pre-configured model options.
- Organization Admin permissions in Omni
- In Databricks:
- A Databricks workspace with serving endpoints configured. See the Databricks serving endpoints documentation for more information.
- A Databricks Personal Access Token (PAT) with permissions to access your serving endpoints. See the Databricks PAT documentation for more information.
- The name(s) of your Databricks serving endpoint(s)
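Before entering anything in Omni, it can help to confirm that your PAT can see the serving endpoints you plan to use. A minimal sketch of the request you would send (the workspace subdomain and token are placeholders; `GET /api/2.0/serving-endpoints` is the standard Databricks REST route for listing endpoints):

```python
import urllib.request

WORKSPACE = "your-workspace"            # placeholder: your workspace subdomain
PAT = "your-personal-access-token"      # placeholder: a Databricks PAT

# Databricks REST API route that lists serving endpoints and their names
url = f"https://{WORKSPACE}.cloud.databricks.com/api/2.0/serving-endpoints"
req = urllib.request.Request(url, headers={"Authorization": f"Bearer {PAT}"})
print(req.full_url)
```

Sending this request (for example with `urllib.request.urlopen(req)`) returns your endpoints; each entry's name is the value you will enter in Omni.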
Configuration
Define the Query model and Text/reasoning model. You can use the same endpoint for both models or specify different endpoints for different tasks.
The endpoint names you provide must exactly match the serving endpoint names configured in your Databricks workspace. For example, if you named a serving endpoint omni-model-provider, you would enter omni-model-provider in Omni.
Add the connection details:
- Base URL - Enter the serving endpoint URL of your Databricks workspace, for example: https://<your-workspace>.cloud.databricks.com/serving-endpoints
- API key - Enter your Databricks Personal Access Token
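To see how the two values fit together, here is a minimal sketch of how the Base URL and an endpoint name combine into the route that actually serves requests (the workspace and endpoint names are placeholders; `/invocations` is the standard Databricks serving-endpoint route):

```python
WORKSPACE = "your-workspace"           # placeholder: your workspace subdomain
ENDPOINT_NAME = "omni-model-provider"  # must exactly match the name in Databricks

# The value entered in Omni's Base URL field
base_url = f"https://{WORKSPACE}.cloud.databricks.com/serving-endpoints"

# Requests to a given model resolve to that endpoint's invocation route
invocation_url = f"{base_url}/{ENDPOINT_NAME}/invocations"
print(invocation_url)
```

If the endpoint name entered in Omni differs from the Databricks name by even one character, this route will not exist and requests will fail.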