FabricAI.Prompt
Accessing Data

Returns the result of passing input text to an AI model hosted in Microsoft Fabric.
Syntax
FabricAI.Prompt(input as text, optional context as any) as text

Parameters
| Name | Type | Required | Description |
|---|---|---|---|
| input | text | Yes | The prompt text to send to the AI model. |
| context | any | No | Optional additional data relevant to the request, typically provided as a record. This context is passed alongside the input to give the AI model supplementary information for generating its response. |
Return Value
text — The text response generated by the AI model based on the input prompt and optional context.
Remarks
FabricAI.Prompt sends a text prompt to an AI model within the Microsoft Fabric environment and returns the model's text response. This function enables AI-powered data enrichment, classification, summarization, and extraction directly within Power Query transformations.
Authentication: Uses the current user's Microsoft Fabric credentials automatically. No separate authentication configuration is required, as the function runs within the Fabric workspace context and inherits the workspace identity and permissions.
Context parameter: The optional context parameter accepts any value (commonly a record) that provides additional data for the AI model. For example, you can pass a record containing a column value from the current row, allowing the model to analyze row-level data. The context is included alongside the input prompt when calling the underlying AI model.
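Because the context record is forwarded as-is, it can carry more than one field. A minimal sketch (the field names Review, Product, and Rating are illustrative, not required by the function):

```powerquery-m
// Pass several related values as context; the record's contents are
// forwarded to the model alongside the prompt text.
FabricAI.Prompt(
    "Does this rating seem consistent with the review text? Answer yes or no.",
    [
        Review  = "Stopped working after a week.",
        Product = "Cordless drill",
        Rating  = 5
    ]
)
```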
Query folding: Not supported. Each call to FabricAI.Prompt makes an independent request to the AI model. When used in Table.AddColumn to process many rows, each row triggers a separate AI call.
Platform availability: This function is available in Microsoft Fabric environments including Power BI Service (Fabric workspaces), Dataflows Gen2, and Fabric notebooks. It is not available in Power BI Desktop, Excel Desktop, or Excel Online. The function requires a Fabric capacity (F64 or higher) with AI features enabled.
Performance considerations: AI model calls introduce latency per row. For large datasets, consider filtering to a representative subset before applying FabricAI.Prompt, or use it on aggregated data rather than individual rows to minimize the number of API calls.
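One way to cap the number of AI calls while developing a query is to sample rows before adding the AI column. A sketch using Table.FirstN (any row filter works equally well):

```powerquery-m
let
    Source = #table({"Review"}, {
        {"Great value for the price."},
        {"Arrived damaged and support never replied."},
        {"Does the job."},
        {"Would not buy again."}
    }),
    // Limit to a small sample first so each refresh triggers
    // only a few AI calls instead of one per source row.
    Sample = Table.FirstN(Source, 2),
    WithSentiment = Table.AddColumn(Sample, "Sentiment", each
        FabricAI.Prompt("Classify as positive, negative, or neutral:", [Review = _[Review]])
    )
in
    WithSentiment
```

Once the prompt behaves as expected on the sample, replace the Table.FirstN step with the full table.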
Usage limits: Subject to Fabric capacity limits and AI usage quotas. Monitor Fabric capacity metrics to ensure sufficient resources for AI workloads.
Examples
Example 1: Categorize a product review using AI
FabricAI.Prompt(
    "Categorize the review as positive, negative, or neutral, and extract a single key word in parentheses.",
    [Review = "This is a great product. It only broke four times before we had to return it!"]
)
// Output: "Negative (broke)"

Example 2: Summarize a text description
FabricAI.Prompt(
    "Summarize the following product description in one sentence.",
    [Description = "This all-in-one blender features a 1200W motor, 6 stainless steel blades, and a BPA-free pitcher."]
)

Example 3: Add AI-generated categories to a table
let
    Source = #table({"Review"}, {
        {"Absolutely love this product!"},
        {"Terrible quality, broke on day one."},
        {"It works fine, nothing special."}
    }),
    Categorized = Table.AddColumn(Source, "Sentiment", each
        FabricAI.Prompt("Classify as positive, negative, or neutral:", [Review = [Review]])
    )
in
    Categorized