Requirements
To follow this guide, you’ll need:
- An understanding of Omni modeling concepts
- Familiarity with Omni’s model IDE
- Familiarity with basic AI terminology such as tokens, agents, etc.
AI context in Omni
What is AI context?
Context is information such as:
- Descriptions of topics and fields
- Possible field values
- How a field might be used
Why provide context to AI?
Context windows allow AI models to understand and incorporate relevant information. Providing context to the LLM makes it more effective, enabling it to interpret your request accurately and generate meaningful responses.
What does Omni AI use for context?
Omni uses the following for context, in priority order:
Prioritized field properties
- `fully_qualified_name` (`view_name.field_name`) - For example, `order_items.total_sale_price`
- `name` - For example, `total_sale_price`
- `aliases`
- `ai_context`
- `all_values` - Used to specify the possible values for the field
Pruned field properties
These properties will be pruned in the following order, but note that fields may be truncated entirely if required.
- `description`
- `group_label` - The categorization of the field
- `label` - If defined, the value will be used. If not provided, a title-cased field name with underscores removed will be used. For example, `Total Sale Price`
- `semantic_type` - The dimension or aggregate type, such as `dimension`, `sum`, `count`, etc. This value is evaluated based on the field definition and can’t be directly modified.
- `data_type` - The field’s data type, such as `number`, `string`, etc.
- `sample_values` - Example values for the field
- `synonyms` - Other terms used to refer to the field
Where does context apply?
Context applies to:
- AI assistant / Embedded chat instances
- AI query helper (Blobby) in the workbook
- AI summary visualizations
- AI filter generator
What limitations does context have?
Omni’s model can handle ~200K characters of context:
- ~15-25K is reserved for Omni app context, which is used to ensure queries are generated properly
- The remaining ~175K characters are allocated for context from the semantic layer before truncation, starting with field metadata
Topic metadata & prioritized field properties
You have a topic that you want to use for AI querying. Including the topic’s `description`, `name`, `base_view`, `ai_context`, and the included fields’ prioritized properties (e.g., `name`), let’s say the total characters used is around 10,000.
Topic field metadata
In the topic, the included fields generally look like this, with the basic metadata for each field totaling roughly 160 characters:
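A minimal sketch of what this basic metadata might look like for a single hypothetical field (the field and values below are assumptions, not the exact format Omni uses):

```yaml
# Hypothetical field: order_items.total_sale_price (roughly 160 characters of metadata)
fully_qualified_name: order_items.total_sale_price
name: total_sale_price
label: Total Sale Price
semantic_type: sum
data_type: number
```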
Field-level context
You’ll likely want to add field-level context, such as `description`, `sample_values`, and `ai_context`. Let’s add 140 characters to account for the AI-specific metadata, which brings the total per field to roughly 300 characters:
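Continuing the sketch, the AI-specific metadata for the same hypothetical field might look like:

```yaml
# Hypothetical AI-specific metadata (~140 additional characters)
description: Total revenue from the sale of the item, in USD
sample_values: [120.50, 89.99, 45.00]
ai_context: Use this field for questions about revenue or total sales.
```

At roughly 300 characters per field, the ~165K characters remaining after topic metadata (175K minus ~10K) accommodate on the order of 550 fields before truncation begins.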
How is context to the AI processed?
Context provided to Omni’s AI is shared with AWS Bedrock. Refer to the AI data security guide for more information.
Curate AI outputs across the model
You can use the model parameter `ai_context` to pass context shared across topics. This parameter may also be useful for topic selection in the AI Assistant.
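A minimal sketch of a model-level `ai_context`, assuming a hypothetical e-commerce model; the wording and topic names are illustrative:

```yaml
# Model file (hypothetical example)
ai_context: |
  This model covers an e-commerce business. Use the Order Transactions topic
  for questions about revenue and orders, and the Web Events topic for
  questions about site traffic. The fiscal year starts on February 1.
```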
Implement chain-of-thought reasoning for AI chat
If you want to have Blobby give a more thorough explanation of what is being generated (topic selection, field selection, etc.), you can include the following context within the model’s `ai_context`.
This can be altered and tweaked if desired, but the reference to GenerateQuery is required for proper behavior.
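The prompt below is an illustrative sketch rather than canonical wording; note that it keeps the required reference to GenerateQuery:

```yaml
# Model file (hypothetical wording)
ai_context: |
  Before calling GenerateQuery, explain your reasoning step by step:
  1. Name the topic you selected and why it fits the question.
  2. List the fields you plan to use and what each represents.
  3. Describe any filters, aggregations, or sorts you will apply.
  Then call GenerateQuery with the query you described.
```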
Output summaries in multiple languages
If you’d like Blobby to summarize outputs in multiple languages, you can include the following context:
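An illustrative sketch of such a prompt (the exact wording is an assumption):

```yaml
# Model file (hypothetical wording)
ai_context: |
  When summarizing query results, respond in the same language the user asked
  the question in. If the user requests a specific language, provide the
  summary in that language in addition to English.
```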
Curate topics
Topics have an `ai_context` parameter, which is useful for providing behavioral prompts and guidance for handling certain questions specific to the topic.
For example:
ai_context for e-commerce orders topic
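A hypothetical sketch; the field names and business rules below are assumptions:

```yaml
# order_items.topic (hypothetical)
ai_context: |
  This topic contains e-commerce order line items. Use it for questions about
  revenue, order volume, and customer purchasing behavior. "Sales" and
  "revenue" both refer to order_items.total_sale_price. Unless the user says
  otherwise, exclude cancelled and returned orders.
```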
Limit fields included in the context window
Blobby only uses the fields that exist within the context of a topic to output a response. Fields that have a `hidden: true` parameter are excluded from the context window. If you’d like to specify which fields to include in the context window, you can leverage the `ai_fields` parameter within a topic, as shown below. For fields that you want to include, you can provide additional context at the field level.
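A minimal sketch of `ai_fields` in a topic; the field names are assumptions:

```yaml
# In a topic file (hypothetical field list)
ai_fields:
  [order_items.total_sale_price, order_items.created_at, users.state, products.category]
```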
While Blobby can create calculations and aggregations, we recommend periodically checking the Analytics > AI usage dashboard to see what questions your users are asking, so you can identify opportunities to promote logic to the shared model and improve the self-serve experience.
Reuse logic and limit the context window with topic extensions
Topics that you’re already leveraging in Omni can be extended to further curate them for AI. Using the `extends` parameter, you can reuse the definition of an existing topic without needing to repeat the code. Consider the following Order Transactions topic, which you want to extend to create a curated version dedicated to AI usage:
Order Transactions topic
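A simplified sketch of what such a topic might look like; the joins and field list are assumptions:

```yaml
# order_items.topic (hypothetical)
label: Order Transactions
joins:
  users: {}
  products: {}
fields: [order_items.*, users.*, products.*]
```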
Use `extends: [ order_items ]` to extend the Order Transactions topic. You can then specify what to include in the AI-specific topic, such as limiting fields, filtering the data for specific use cases, adding more AI context, and so on:
Orders for AI querying topic, extended from Order Transactions
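A hypothetical sketch of the extended topic; the field list, filter, and context are assumptions:

```yaml
# orders_for_ai.topic (hypothetical)
extends: [order_items]
label: Orders for AI querying
fields: [order_items.total_sale_price, order_items.created_at, users.state, products.category]
default_filters:
  order_items.status:
    is: complete
ai_context: |
  Use this topic for natural-language questions about completed orders.
```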
Add example queries as context
Along with providing context about the topic itself, you can use the topic’s `sample_queries` parameter to provide example questions. This approach is useful if you anticipate specific, recurring questions or you find that Blobby struggles with date filters.
To do this, you’ll want to:
- Create a query in a workbook that contains the correct answer to the question.
- In the workbook, click Model > Save as sample query to topic.
- When prompted, fill in the following:
  - Label - A user-friendly name for the query
  - Description - An optional description
  - Display on topic overview - If checked, the query will display as a sample query when the topic is selected in a new query tab
  - Include in AI context - If checked, the query will be included in the AI context for the topic
  - Prompt - An optional example prompt that could be used to generate this query
  - AI context - Optional, additional context for the AI
- When finished, click Save.
Curate views and fields
Views also have an `ai_context` parameter, which can be useful for passing context to AI that is specific to the view.
Keeping fields organized and labeled not only helps you create a top-notch self-service experience, it also makes Omni’s AI more efficient. The following parameters can be used to add metadata to fields for the purposes of AI:
- `ai_context` - Adds context useful for AI responses
- `all_values` - All possible values for the field
  - Note: when the dbt integration is enabled, `accepted_values` tests will be ingested as `all_values`
- `sample_values` - Example values for the field
- `synonyms` - Other terms used to refer to the field
In the workbook
- In the field browser, click the three dots icon next to the field.
- Click Modeling > Edit to open the Edit field side panel.
In the IDE
In the IDE, navigate to the view containing the field to add the parameters. For example:
products.view
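A hypothetical sketch of view-level and field-level AI metadata; the fields and values are assumptions:

```yaml
# products.view (hypothetical)
ai_context: The current product catalog, one row per product.
dimensions:
  category:
    ai_context: Product category, used for merchandising and assortment questions.
    synonyms: [department, product type]
    sample_values: [Jeans, Sweaters, Accessories]
  brand:
    ai_context: The brand that manufactures the product.
    synonyms: [manufacturer, maker]
```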