Organization Admin permissions are required to access and modify AI settings.
General
An AI icon next to a setting indicates that the related feature uses an LLM to process query results. Refer to the AI data privacy documentation for information on how Omni’s AI features process data.
Enable AI
Enables AI features in the instance. This setting must be enabled before the other AI features in this tab can be individually enabled or disabled.
Query assistant
Settings related to the AI chat experience. This setting group contains subsettings that use an LLM to process query results.
Enables the Query helper in workbooks and the standalone AI Assistant page. There are additional subsettings that can be enabled:
- Chat - When enabled, the AI Assistant page is accessible outside of workbooks. This also enables embedding the AI Assistant into external applications.
- Blank workbook topic picking - Enables the Ask AI option in blank workbook tabs.
- Read result data - Allows the AI to read query results. Used to respond to summarization prompts, detect anomalies, and highlight insights in the chat interface. Must be enabled to use the Dashboard Assistant.
AI Visualization Summary
Branding
The AI > Branding tab is used to customize the appearance of the AI chat interface, including the Query helper, AI Assistant, and Dashboard Assistant. These settings apply everywhere the AI chat experience appears, including externally embedded assistants. To see how your changes will impact the AI assistant, click Save.
Assistant name
The name of the AI assistant, used in chat conversations.
URL to assistant image
A publicly accessible image, used as the AI assistant’s icon. A square PNG (128x128 pixels) will produce the best results.
Intro headline
An optional greeting, displayed at the beginning of a chat session.
Intro body
Optional additional text, displayed at the beginning of a chat session.
Prompt placeholder
| Provider | API key required | Base URL supported | Description |
Placeholder text to display in the end user’s chat box.
Model
To enable custom AI model configuration for your organization, contact Omni support. Once enabled, Organization Admin permissions are required to access and modify AI model settings.
Supported providers
| Provider | API key required | Base URL | Description |
|---|---|---|---|
| AWS Bedrock | No | No | Default provider. Uses Omni’s managed AWS credentials and model selection. |
| Anthropic Direct | Yes | No | Direct API access to Anthropic’s Claude models. Allows you to select which model is used. |
| OpenAI | Yes | Yes | Access to GPT models. Base URL support allows integration with OpenAI-compatible APIs like Azure OpenAI and Ollama. |
| Snowflake Cortex | Yes | Yes | Access to Anthropic’s Claude models through Snowflake Cortex. Requires a Snowflake Programmatic Access Token (PAT) and base URL. |
| Grok (xAI) | Yes | No | Access to xAI’s Grok models. |
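The required-field combinations in the table above can be sketched as a small validation helper. This is purely illustrative (the provider names come from the table; the function and data structure are not part of Omni’s API):

```python
# Provider requirements as listed in the table above (illustrative only).
PROVIDERS = {
    "AWS Bedrock":      {"api_key": False, "base_url": False},
    "Anthropic Direct": {"api_key": True,  "base_url": False},
    "OpenAI":           {"api_key": True,  "base_url": True},
    "Snowflake Cortex": {"api_key": True,  "base_url": True},
    "Grok (xAI)":       {"api_key": True,  "base_url": False},
}

def missing_fields(provider, api_key=None, base_url=None):
    """Return which fields are still missing for the given provider."""
    req = PROVIDERS[provider]
    missing = []
    if req["api_key"] and not api_key:
        missing.append("api_key")
    if req["base_url"] and not base_url:
        missing.append("base_url")
    return missing

print(missing_fields("Snowflake Cortex", api_key="pat-token"))
# ['base_url']
```

For example, the default AWS Bedrock provider needs neither field, while Snowflake Cortex is incomplete until both a PAT and a base URL are supplied.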
Configuration options
- Provider - The AI model provider to use for your organization. The provider determines which models are available and how API requests are handled.
- Query model - The model used for complex reasoning tasks, such as generating queries and answering detailed questions. Choose a more capable model (such as Sonnet-class for Anthropic or GPT-4o for OpenAI) for best results.
- Text model - The model used for simpler tasks, such as text completion and basic assistance. A smaller, faster model (such as Haiku-class for Anthropic or GPT-4o Mini for OpenAI) is recommended.
- Custom model - Override the default model selection with a custom model ID. Use this option to specify a model not listed in the default options, such as a fine-tuned model or a newer release.
- API key - Your API key for the selected provider. API keys are securely stored and not displayed after saving. Required for:
- Anthropic Direct
- OpenAI
- Snowflake Cortex - Use a Snowflake Programmatic Access Token (PAT)
- Grok
- Base URL - The base URL for API requests. Available for:
- OpenAI - Use this to connect to OpenAI-compatible APIs like Azure OpenAI or self-hosted Ollama instances
- Snowflake Cortex - Provide your Snowflake account’s base URL
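For Snowflake Cortex, the base URL is typically derived from your account identifier. A minimal sketch, assuming the standard Snowflake account URL pattern (the helper function is hypothetical, not an Omni or Snowflake API):

```python
def snowflake_base_url(account_identifier: str) -> str:
    # Standard Snowflake account URL pattern:
    # https://<account_identifier>.snowflakecomputing.com
    return f"https://{account_identifier}.snowflakecomputing.com"

print(snowflake_base_url("myorg-myaccount"))
# https://myorg-myaccount.snowflakecomputing.com
```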
Available models
When using a custom provider, you can select which model is used for AI features. The default AWS Bedrock provider uses Omni’s managed model selection and does not allow customization.
Available when using the Anthropic Direct provider:
| Model | Class | Description |
|---|---|---|
| Claude Sonnet 4.5 | Query | Latest Sonnet model |
| Claude Sonnet 4 | Query | Recommended for complex reasoning tasks |
| Claude 3.5 Sonnet | Query | Previous generation Sonnet model |
| Claude Haiku 4 | Text | Recommended for simple, fast tasks |
| Claude 3.5 Haiku | Text | Previous generation Haiku model |
| Claude 3 Haiku | Text | Lightweight model for basic tasks |