AI data privacy

Several features in Omni are powered by AI, allowing you to use natural language to ask questions about your data, get help creating queries, build calculations, and more.

LLM providers

Omni uses two LLM providers to power its AI features:

  • OpenAI
  • Amazon Web Services (AWS) Bedrock
Note: Any data, including metadata, provided to the LLMs used by Omni will not be used for training.

User prompt categorization

When a user enters a prompt, Omni will first use AWS Bedrock's hosted Claude model to categorize the request to determine how to respond. If the request is for query generation or building a workbook calculation, Omni will proceed using OpenAI.

Shared data

To determine the correct category for the prompt, Omni shares the prompt itself with AWS Bedrock. Specifically, this is the natural language question or prompt provided by the user. For example, "How many users signed up last month?" or "Filter by the last two years."

Query generation & workbook calculations

When generating a query, the output of the AI's processing is a new Omni query, which is a collection of metadata - including field names, filters, sorts, pivots, topic name, and limit - that is translated into SQL. This new query is then run within the current workbook to provide an answer to the user's prompt.

Because the AI uses an abstract query format and not SQL to create queries, the generated query respects the permissions set in Omni. This means it's not possible to access data outside of the topics, models, and connections a user has been restricted to in Omni.

Additionally, this approach ensures that no private, relational, or result set data is shared. Only metadata about the current query, user prompt, and selected topic fields is sent to OpenAI, thus maintaining the privacy and security of your data.
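To make the abstract query format concrete, here is a purely illustrative sketch. Omni's actual internal representation is not public, so every key name and the translation logic below are hypothetical; the point is that the AI's output is metadata (fields, filters, sorts, limit) that is only later rendered into SQL:

```python
# Hypothetical illustration of an abstract, metadata-only query being
# translated into SQL. This is NOT Omni's actual internal format.
def to_sql(q):
    """Render an abstract query dict into a SQL string."""
    sql = f"SELECT {', '.join(q['fields'])} FROM {q['topic']}"
    if q.get("filters"):
        sql += " WHERE " + " AND ".join(q["filters"])
    if q.get("sorts"):
        sql += " ORDER BY " + ", ".join(q["sorts"])
    if q.get("limit"):
        sql += f" LIMIT {q['limit']}"
    return sql

# The AI only ever produces a structure like this -- no raw rows involved.
query = {
    "topic": "users",
    "fields": ["users.signup_month", "users.count"],
    "filters": ["users.created_at >= '2024-01-01'"],
    "sorts": ["users.signup_month"],
    "limit": 100,
}

print(to_sql(query))
```

Because the model emits only this kind of structure, the query it produces is still resolved through Omni's own model layer, where topic, model, and connection permissions apply.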

Note: While Omni hasn't opted in to sharing data with OpenAI, Omni has also not specifically opted in to OpenAI's zero data retention policy.

Shared data

While relational data is never shared, Omni does share certain metadata with OpenAI to generate accurate responses. This metadata includes:

  • Current query metadata: Information about the currently selected query in the workbook, such as field names, sorts, limits, pivots, and filters.
  • User prompt: The natural language question or prompt provided by the user. For example, "How many users signed up last month?" or "Filter by the last two years."
  • Context (ai_context): Free text that provides context. This is set at the topic level using the ai_context parameter.
  • Fields within the selected topic: Metadata about the fields in the currently selected topic, such as field names, labels, descriptions, data types, and whether they're aggregates (count, sum, average) or dimensions. Note: Even if a user has access to multiple topics in a workbook, only metadata for the topic that's currently selected is accessed.
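The items above can be pictured as a single metadata-only payload. The sketch below is purely illustrative: Omni's actual request format is not public, so every key name here is an assumption. What it demonstrates is that each piece is descriptive metadata, never rows from your database:

```python
# Hypothetical sketch of the kind of metadata-only payload described above.
# Omni's real wire format is not public and may differ entirely.
payload = {
    "current_query": {  # metadata about the currently selected query
        "fields": ["users.signup_month", "users.count"],
        "sorts": ["users.signup_month desc"],
        "limit": 100,
        "pivots": [],
        "filters": ["users.created_at >= '2024-01-01'"],
    },
    "user_prompt": "How many users signed up last month?",
    # Set at the topic level via the ai_context parameter:
    "ai_context": "Revenue fields are in USD.",
    "topic_fields": [
        # Field metadata only: names, labels, types, aggregate vs. dimension.
        {"name": "users.count", "label": "Count", "type": "number", "aggregate": True},
        {"name": "users.signup_month", "label": "Signup Month", "type": "date", "aggregate": False},
    ],
}

# No relational or result set data appears anywhere in the payload.
assert "rows" not in payload
```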

Data summarization

Omni's AI uses AWS Bedrock to power its data summarization features, which include the AI summary visualization and using the query helper to summarize the results of a query.

Shared data

Note: Models are region-specific to ensure data privacy. For example, if you're based in the EU, the models you use will be deployed in an EU-based region to ensure your data never leaves your region.

To generate accurate summaries, the query and its results are shared with AWS Bedrock. All data will remain within AWS.

Feature reference

The following table details each of Omni's AI features and the LLM(s) it uses. Refer to the AI settings reference for information about enabling and disabling these features.

Note: For AWS Bedrock, items marked with an asterisk (*) indicate that data is shared with AWS only for prompt categorization purposes.

Name                     | AWS Bedrock | OpenAI
-------------------------|-------------|-------
Workbook calculations    | Yes*        | Yes
Workbook query helper    | Yes*        | Yes
Blank workbook jumpstart | Yes*        | Yes
Visualization helper     | Yes*        | Yes
IDE assistant            | Yes*        | Yes
Data summarization       | Yes         | No
AI summary visualization | Yes         | No