ai_query function

Applies to: ✓ Databricks SQL ✓ Databricks Runtime

Preview

This feature is in Public Preview.

Invokes an existing Databricks Model Serving endpoint, then parses and returns its response.

Requirements

  • This function is not available on Databricks SQL Classic.

  • Querying Foundation Model APIs is enabled by default. To query endpoints that serve custom models or external models, enable AI_Query for Custom Models and External Models in the Databricks Previews UI.

  • The current DLT warehouse channel does not use the latest Databricks Runtime version that supports ai_query(). To use ai_query() in a DLT pipeline, set pipelines.channel to 'preview' in the table properties. See Examples for a sample query.

Note

  • In Databricks Runtime 14.2 and above, this function is supported in Databricks notebooks, including notebooks that are run as a task in a Databricks workflow.

  • In Databricks Runtime 14.1 and below, this function is not supported in Databricks notebooks.

Syntax

To query an endpoint that serves an external model or a foundation model:

ai_query(endpointName, request)

To query a custom model serving endpoint with a model schema:

ai_query(endpointName, request)

To query a custom model serving endpoint without a model schema:

ai_query(endpointName, request, returnType)

Arguments

  • endpointName: A STRING literal, the name of a Databricks Foundation Model serving endpoint, an external model serving endpoint, or a custom model serving endpoint in the same workspace. The definer must have CAN QUERY permission on the endpoint.

  • request: An expression, the request used to invoke the endpoint.

    • If the endpoint is an external model serving endpoint or Databricks Foundation Model APIs endpoint, the request must be a STRING.

    • If the endpoint is a custom model serving endpoint, the request can be a single column or a struct expression. The struct field names should match the input feature names expected by the endpoint.

  • returnType: An expression, the expected return type from the endpoint. This is similar to the schema parameter of the from_json function: it accepts either a STRING expression or an invocation of the schema_of_json function. See the sample queries after this list for both forms.

    • In Databricks Runtime 14.2 and above, if this expression is not provided, ai_query() automatically infers the return type from the model schema of the custom model serving endpoint.

    • In Databricks Runtime 14.1 and below, this expression is required for querying a custom model serving endpoint.
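
For illustration, the following sketch queries a hypothetical custom model serving endpoint named 'sentiment-endpoint' over a hypothetical product_reviews table, supplying the return type first as a DDL-formatted STRING and then as the result of schema_of_json on a sample response:

> SELECT ai_query(
    'sentiment-endpoint',
    named_struct('text', review_text),
    'STRUCT<label:STRING, score:DOUBLE>') AS sentiment
  FROM product_reviews;

> SELECT ai_query(
    'sentiment-endpoint',
    named_struct('text', review_text),
    schema_of_json('{"label": "positive", "score": 0.98}')) AS sentiment
  FROM product_reviews;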

Returns

The parsed response from the endpoint.

Examples

To query an external model serving endpoint:

> SELECT ai_query(
    'my-external-model-openai-chat',
    'Describe Databricks SQL in 30 words.'
  ) AS summary

  "Databricks SQL is a cloud-based platform for data analytics and machine learning, providing a unified workspace for collaborative data exploration, analysis, and visualization using SQL queries."

To query a foundation model supported by Databricks Foundation Model APIs:

> SELECT *,
  ai_query(
    'databricks-meta-llama-3-1-70b-instruct',
    "Can you tell me the name of the US state that serves the provided ZIP code? zip code: " || pickup_zip
    )
  FROM samples.nyctaxi.trips
  LIMIT 10

Optionally, you can wrap a call to ai_query() in a UDF so it can be invoked by a function name, as follows:

> CREATE FUNCTION correct_grammar(text STRING)
  RETURNS STRING
  RETURN ai_query(
    'databricks-llama-2-70b-chat',
    CONCAT('Correct this to standard English:\n', text));
> GRANT EXECUTE ON FUNCTION correct_grammar TO ds;
-- DS fixes grammar issues in a batch.
> SELECT
    * EXCEPT (text),
    correct_grammar(text) AS text
  FROM articles;

To query a custom model serving endpoint:

> SELECT text, ai_query(
    endpoint => 'spam-classification-endpoint',
    request => named_struct(
      'timestamp', timestamp,
      'sender', from_number,
      'text', text),
    returnType => 'BOOLEAN') AS is_spam
  FROM messages


> SELECT ai_query(
    'weekly-forecast',
    request => struct(*),
    returnType => 'FLOAT') AS predicted_revenue
  FROM retail_revenue


> SELECT ai_query(
    'custom-llama-2-7b-chat',
    request => named_struct("messages",
        ARRAY(named_struct("role", "user", "content", "What is ML?"))),
    returnType => 'STRUCT<candidates:ARRAY<STRING>>')

  {"candidates":["ML stands for Machine Learning. It's a subfield of Artificial Intelligence that involves the use of algorithms and statistical models to enable machines to learn from data, make decisions, and improve their performance on a specific task over time."]}

Example query for setting the DLT channel to preview:

> CREATE OR REPLACE MATERIALIZED VIEW ai_query_mv
  TBLPROPERTIES('pipelines.channel' = 'PREVIEW') AS
  SELECT
    ai_query("databricks-dbrx-instruct", text) AS response
  FROM
    messages