When using Databricks Genie as your AI model provider, you can use AI models deployed through Databricks serving endpoints to power AI features in Omni.

Requirements

Databricks Genie requires you to create and configure your own serving endpoints in Databricks before configuring them in Omni. Unlike other providers, there are no pre-configured model options.
To follow the steps in this guide, you’ll need:
  • Organization Admin permissions in Omni
  • In Databricks:
      ◦ One or more configured model serving endpoints
      ◦ A personal access token for authentication

Configuration

1. In Omni, navigate to Settings > AI > Model.
2. Select Databricks Genie as the Provider.
3. Define the Query model and Text/reasoning model. You can use the same endpoint for both, or specify different endpoints for different tasks.
   The endpoint names you enter must exactly match the serving endpoint names configured in your Databricks workspace. For example, if you named a serving endpoint omni-model-provider, enter omni-model-provider in Omni.
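The exact-match rule above can be sketched as a quick local check. This is only an illustration (the endpoint name omni-model-provider is the example from this guide); Omni compares names verbatim, so case or stray whitespace differences will cause the lookup to fail.

```python
def matches_endpoint(entered: str, deployed_names: list[str]) -> bool:
    """Return True only if the entered name exactly equals a deployed endpoint name."""
    return entered in deployed_names

# Hypothetical list of serving endpoints deployed in your Databricks workspace
deployed = ["omni-model-provider"]

print(matches_endpoint("omni-model-provider", deployed))   # exact match -> True
print(matches_endpoint("Omni-Model-Provider", deployed))   # case differs -> False
print(matches_endpoint("omni-model-provider ", deployed))  # trailing space -> False
```

If a model fails to load in Omni, comparing the name character-for-character against the Databricks endpoint list is a good first check.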
4. Add the connection details:
  • Base URL - The serving-endpoints URL of your Databricks workspace, for example: https://<your-workspace>.cloud.databricks.com/serving-endpoints
  • API key - Your Databricks personal access token
5. Click Save.
Once setup is complete, Omni will use Databricks Genie for AI features. Try asking the AI Assistant a few questions to verify the setup.
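To see how the connection details fit together, here is a minimal sketch (using the placeholder workspace URL and example endpoint name from this guide, which you would replace with your own): the Base URL from step 4 plus the endpoint name from step 3 form the invocation URL that Databricks serving endpoints expose.

```python
import json


def invocation_url(base_url: str, endpoint_name: str) -> str:
    """Join the serving-endpoints base URL with an endpoint name."""
    return f"{base_url.rstrip('/')}/{endpoint_name}"


# Placeholder values -- substitute your workspace URL and endpoint name
base = "https://<your-workspace>.cloud.databricks.com/serving-endpoints"
url = invocation_url(base, "omni-model-provider") + "/invocations"
payload = {"messages": [{"role": "user", "content": "Hello"}]}

print(url)
print(json.dumps(payload))

# To actually send a test request (requires a valid personal access token),
# something like the following, with the token in the Authorization header:
#   import urllib.request
#   req = urllib.request.Request(
#       url,
#       data=json.dumps(payload).encode(),
#       headers={"Authorization": f"Bearer {token}",
#                "Content-Type": "application/json"},
#   )
#   urllib.request.urlopen(req)
```

A 200 response with a model-generated message indicates the endpoint name and token are valid; a 404 usually means the endpoint name does not exactly match one in your workspace.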