To enable custom AI model configuration for your organization, contact Omni support. Once enabled, Organization Admin permissions are required to access and modify AI model settings.
AI features can also be managed with the `ai_settings` parameter in the Omni model.
By default, Omni uses AWS Bedrock with Omni-managed credentials. If you prefer to use a different provider or your own API key, Omni also supports the following providers: Anthropic Direct, OpenAI, Snowflake Cortex, Databricks Genie, and Grok (xAI). Provider settings are configured in Settings > AI > Model.
## Supported providers
| Provider | Description |
|---|---|
| AWS Bedrock | Default provider. Uses Omni’s managed AWS credentials and model selection. |
| Anthropic Direct | Direct API access to Anthropic’s Claude models. Allows you to select which model is used. |
| OpenAI | Access to GPT models. Base URL support allows integration with OpenAI-compatible APIs like Azure OpenAI and Ollama. Supports custom model identifiers for bring-your-own-model (BYOM) configurations. |
| Snowflake Cortex | Access to Anthropic’s Claude models through Snowflake Cortex. |
| Databricks Genie | Access to models via Databricks serving endpoints. See Databricks Genie setup for details. |
| Grok (xAI) | Access to xAI’s Grok models. |
## Configuration options
- Provider - The AI model provider to use for your organization. The provider determines which models are available and how API requests are handled.
- Query model - The model used for complex reasoning tasks, such as generating queries and answering detailed questions. Choose a more capable model (such as Sonnet-class for Anthropic or GPT-4o for OpenAI) for best results.
- Text/Reasoning model - The model used for simpler tasks, such as text completion and basic assistance. A smaller, faster model (such as Haiku-class for Anthropic or GPT-4o Mini for OpenAI) is recommended.
- Custom model - Applicable only to OpenAI. Override the default model selection with a custom model ID. See Custom model identifiers with OpenAI for more information.
- API key - Your API key for the selected provider. API keys are securely stored and not displayed after saving. Required for:
- Anthropic Direct
- OpenAI - Applicable to built-in models and custom models
- Snowflake Cortex - Use a Snowflake Programmatic Access Token (PAT)
- Databricks Genie - Use a Databricks Personal Access Token (PAT)
- Grok
- Base URL - The base URL for API requests. Available for:
- OpenAI - Use this to connect to OpenAI-compatible APIs like Azure OpenAI or self-hosted Ollama instances
- Snowflake Cortex - Provide your Snowflake account’s base URL
  - Databricks Genie - Provide your Databricks workspace URL (e.g., https://your-workspace.cloud.databricks.com)
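To illustrate what an "OpenAI-compatible" base URL means, the sketch below shows how such a base URL resolves to the standard chat-completions endpoint. The localhost Ollama address and the helper function are illustrative assumptions, not part of Omni's configuration; Omni performs this resolution internally once the Base URL is saved.

```python
# Hypothetical sketch: how an OpenAI-compatible base URL maps to the
# standard chat-completions endpoint. Omni handles this internally;
# the host below is an example for a self-hosted Ollama instance.
from urllib.parse import urljoin

def chat_completions_url(base_url: str) -> str:
    """Return the chat-completions endpoint for an OpenAI-compatible API."""
    # Ensure a trailing slash so urljoin appends the path segment
    # instead of replacing the last one.
    if not base_url.endswith("/"):
        base_url += "/"
    return urljoin(base_url, "chat/completions")

# Ollama exposes its OpenAI-compatible API under /v1:
print(chat_completions_url("http://localhost:11434/v1"))
# → http://localhost:11434/v1/chat/completions
```

Any server that accepts requests at this path with the OpenAI request/response schema (such as Azure OpenAI or Ollama) can be used as the provider endpoint.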