
Authentication for AI agents

AI agents often need to authenticate to other resources to complete tasks. For example, a deployed agent might need to access a Vector Search index to query unstructured data, a serving endpoint to call a foundation model, or Unity Catalog functions to execute custom logic.

This page covers authentication methods for agents deployed on Databricks Apps. For agents deployed on Model Serving endpoints, see Authentication for AI agents (Model Serving).

Databricks Apps provides two authentication methods for agents. Each method serves different use cases:

| Method | Description | When to use |
| --- | --- | --- |
| App authorization | The agent authenticates using an automatically created service principal with consistent permissions. Previously called Service Principal authentication. | Most common use case. Use when all users should have the same access to resources. |
| User authorization | The agent authenticates using the identity of the user making the request. Previously called On-Behalf-Of (OBO) authentication. | Use when you need user-specific permissions, audit trails, or fine-grained access control with Unity Catalog. |

You can combine both methods in a single agent. For example, use app authorization to access a shared Vector Search index while using user authorization to query user-specific tables.

Configure authentication with the workspace UI or Declarative Automation Bundles

You can configure all authentication settings in two ways:

  • Workspace UI: Edit the app and manage resources and scopes from the Configure step. Recommended when you're iterating on a single app in the workspace.
  • Declarative Automation Bundles: Declare resources, scopes, and environment variables in a databricks.yml file and deploy with databricks bundle deploy. Recommended when you want Git-based versioning, CI/CD, or to ship the same agent across workspaces. All agent templates ship with a databricks.yml.

Both paths produce the same runtime configuration. The rest of this page shows each instruction in both forms so you can select one and stay consistent within your project.

To add a resource to the app through either path, you must have Can Manage permission on both the resource and the app.

For the full bundle reference, see app resource and app.resources. For an end-to-end bundle walkthrough, see Manage Databricks apps using Declarative Automation Bundles.
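As a minimal sketch of the bundle path (the bundle name, app name, resource IDs, and paths below are placeholders, and the exact schema may vary by CLI version — check the bundle reference above), a databricks.yml might look like:

```yaml
# databricks.yml — minimal sketch; all names, IDs, and paths are placeholders.
bundle:
  name: my-agent

resources:
  apps:
    my_agent_app:
      name: my-agent-app
      source_code_path: ./app
      resources:
        - name: warehouse
          sql_warehouse:
            id: "1234567890abcdef"
            permission: CAN_USE
        - name: llm-endpoint
          serving_endpoint:
            name: my-foundation-model
            permission: CAN_QUERY
```

Run databricks bundle deploy to apply the configuration.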

App authorization

By default, Databricks Apps authenticate using app authorization. When you create an app, Databricks automatically creates a service principal that acts as the app's identity.

All users who interact with the app share the same permissions defined for the service principal. This model works well when you want all users to see the same data or when the app performs shared operations not tied to user-specific access controls.

For detailed information about app authorization, see App authorization.

Grant permissions to the MLflow experiment

Your agent needs access to an MLflow experiment to log traces and evaluation results. Grant the service principal Can Edit permission on the experiment.

  1. Click Edit on your app home page.
  2. Go to the Configure step.
  3. In the App resources section, add the MLflow experiment resource with Can Edit permission.

See Add an MLflow experiment resource to a Databricks app.

Grant permissions to other Databricks resources

If your agent uses other Databricks resources, such as Genie spaces, Vector Search indexes, or SQL warehouses, grant the service principal permissions on each one.

To access the prompt registry, grant CREATE FUNCTION, EXECUTE, and MANAGE permissions on the Unity Catalog schema for storing prompts.
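Those grants can also be applied programmatically. The following is a minimal sketch that builds the request body for the Unity Catalog permissions API (PATCH /api/2.1/unity-catalog/permissions/schema/<catalog>.<schema>); the principal value is a placeholder, and the Databricks SDK's w.grants.update offers the same operation:

```python
# Sketch: request body granting the prompt-registry privileges on a schema.
# The principal is a placeholder; send the body with a PATCH to the Unity
# Catalog permissions API for the schema that stores prompts.
import json

def prompt_registry_grant_body(principal: str) -> str:
    """Return the JSON PATCH body adding CREATE FUNCTION, EXECUTE, and MANAGE."""
    return json.dumps({
        "changes": [{
            "principal": principal,
            "add": ["CREATE_FUNCTION", "EXECUTE", "MANAGE"],
        }]
    })

print(prompt_registry_grant_body("<service-principal-application-id>"))
```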

When granting access to Unity Catalog resources, you must also grant permissions to all downstream dependent resources. For example, if you grant access to a Genie space, you must also grant access to its underlying tables, SQL warehouses, and Unity Catalog functions.

Add resources to the app through the App resources section when you create or edit the app in the Databricks workspace.

  1. Click Edit on your app home page.
  2. Go to the Configure step.
  3. In App resources, click + Add resource for each resource the agent uses and set the permission.

See Add resources to a Databricks app for the complete list of supported resources and screenshots.

The following table lists the minimum permissions used in the examples above and the equivalent Declarative Automation Bundles value for each resource type:

| Resource type | Workspace UI permission | Declarative Automation Bundles resource and permission |
| --- | --- | --- |
| SQL Warehouse | Can Use | sql_warehouse with CAN_USE |
| Model Serving endpoint | Can Query | serving_endpoint with CAN_QUERY |
| Unity Catalog Function | Can Execute | uc_securable with securable_type: FUNCTION and EXECUTE |
| Genie space | Can Run | genie_space with CAN_RUN |
| Vector Search index | Can Select | uc_securable with securable_type: TABLE and SELECT |
| Unity Catalog Table | SELECT | uc_securable with securable_type: TABLE and SELECT |
| Unity Catalog Connection | Use Connection | uc_securable with securable_type: CONNECTION and USE_CONNECTION |
| Unity Catalog Volume | Can Read or Can Read and Write | uc_securable with securable_type: VOLUME and READ_VOLUME or WRITE_VOLUME |
| Lakebase (provisioned) | Can Connect and Create | database with CAN_CONNECT_AND_CREATE |
| Lakebase (autoscaling) | Can Connect and Create | postgres with CAN_CONNECT_AND_CREATE |
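For example (a sketch — the securable full names are placeholders, and field names may vary by CLI version), the uc_securable entries map to app resource declarations like:

```yaml
resources:
  - name: invoice-function
    uc_securable:
      securable_type: FUNCTION
      securable_full_name: prod.billing.lookup_invoice
      permission: EXECUTE
  - name: docs-index
    uc_securable:
      securable_type: TABLE
      securable_full_name: prod.customer_support.docs_index
      permission: SELECT
```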

Follow the principle of least privilege. Grant the service principal only the permissions the agent needs, and use a dedicated service principal per app. For the full list, see Security best practices.

User authorization

Preview

User authorization is in Public Preview. A workspace admin must enable it before you can use it.

User authorization allows an agent to act with the identity of the user making the request. This provides:

  • Per-user access to sensitive data
  • Fine-grained data controls enforced by Unity Catalog
  • User-specific audit trails
  • Automatic enforcement of row-level filters and column masks

Use user authorization when your agent needs to access resources using the requesting user's identity instead of the app's service principal.

How user authorization works

When you configure user authorization for your agent:

  1. Add API scopes to your app: Define which Databricks APIs the app can access on behalf of users. See Add scopes to an app.
  2. User credentials are downscoped: Databricks takes the user's credentials and restricts them to only the API scopes you defined.
  3. Token forwarding: The downscoped token is made available to your app through the x-forwarded-access-token HTTP header.
  4. The Agent Server stores the token: The MLflow Agent Server automatically stores this token for each request so that agent code can access it conveniently.
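Conceptually, step 3 amounts to reading the forwarded header. In practice the Agent Server does this for you, but the following minimal sketch (with a hypothetical headers dict) shows what happens, and why no user token exists outside a request:

```python
# Sketch: extract the downscoped user token that Databricks Apps forwards
# on each request. The Agent Server performs this step automatically; the
# headers dict below is hypothetical.

def user_token_from_headers(headers: dict) -> str:
    """Return the per-request user token, or raise outside a user request."""
    token = headers.get("x-forwarded-access-token")
    if not token:
        raise RuntimeError(
            "x-forwarded-access-token missing: user credentials exist "
            "only while handling a user request"
        )
    return token

# Hypothetical request headers:
print(user_token_from_headers({"x-forwarded-access-token": "eyJ...downscoped"}))
```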

Configure user authorization by adding scopes in the Databricks Apps UI when creating or editing your app, or programmatically using the API. See Add scopes to an app for detailed instructions.

Agents with user authorization can access the following Databricks resources:

  • SQL Warehouse
  • Genie Space
  • Files and directories
  • Model Serving Endpoint
  • Vector Search Index
  • Unity Catalog Connections
  • Unity Catalog Tables

Implement user authorization

To implement user authorization, you must add authorization scopes to your app. Scopes restrict what the app can do on the user's behalf. For the list of available scopes and scope semantics, see Scope-based security and privilege escalation.

  1. In the Databricks UI, go to your app's Authorization settings.
  2. Under User authorization, click + Add scope and select the scopes that the app needs to access resources on behalf of the user.
  3. Save the changes and restart the app.
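In a bundle, a sketch of the equivalent configuration declares the scopes on the app itself (assuming the user_api_scopes field; the scope names shown are examples — request only what your agent needs):

```yaml
resources:
  apps:
    my_agent_app:
      name: my-agent-app
      user_api_scopes:
        - sql
        - serving.serving-endpoints
        - vectorsearch.vector-search-indexes
```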

To use user authorization in your agent code, retrieve the per-request token from the Agent Server and construct a workspace client with those credentials.

  1. In your agent code, import the authentication utility:

    If using one of the provided templates from databricks/app-templates, import the provided utility:

    Python
    from databricks_app.utils import get_user_workspace_client

    Otherwise, import from the Agent Server utilities:

    Python
    from agent_server.utils import get_user_workspace_client

    The get_user_workspace_client() function uses the Agent Server to capture the x-forwarded-access-token header and constructs a workspace client with those user credentials, handling authentication between the user, app, and agent server.

  2. Initialize the workspace client at query time, not during app startup:

    important

    Call get_user_workspace_client() inside the invoke and stream handlers, not in __init__ or at app startup. User credentials are only available at query time when a user makes a request. Initializing during app startup will fail because no user context exists yet.

    Python
    # In your agent code (inside the invoke or stream handler)
    user_client = get_user_workspace_client()

    # Use user_client to access Databricks resources with the user's permissions
    response = user_client.serving_endpoints.query(name="my-endpoint", inputs=inputs)

For a complete guide on adding scopes and understanding scope-based security, see Scope-based security and privilege escalation. Request only the minimum scopes your agent needs and log every action performed on behalf of a user; see Best practices for user authorization.

Authenticate to Databricks MCP servers

Databricks managed MCP servers expose Vector Search indexes and Unity Catalog functions as tools through URLs of the form https://<workspace>/api/2.0/mcp/vector-search/<catalog>/<schema> and https://<workspace>/api/2.0/mcp/functions/<catalog>/<schema>. For the list of available servers and their URL patterns, see Use Databricks managed MCP servers.

To authenticate, grant the agent's service principal (or the user, if using user authorization) access to every downstream resource in those schemas.

For example, if your agent uses the following MCP server URLs:

  • https://<your-workspace>/api/2.0/mcp/vector-search/prod/customer_support
  • https://<your-workspace>/api/2.0/mcp/vector-search/prod/billing
  • https://<your-workspace>/api/2.0/mcp/functions/prod/billing

Then you must grant access to every Vector Search index in prod.customer_support and prod.billing, and every Unity Catalog function in prod.billing.
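Because the URL pattern is mechanical, a small helper can construct these server URLs; a minimal sketch (the workspace host is a placeholder):

```python
# Sketch: build managed MCP server URLs following the patterns above.
# The workspace host is a placeholder.

def mcp_url(workspace_host: str, kind: str, catalog: str, schema: str) -> str:
    """kind is 'vector-search' for indexes or 'functions' for UC functions."""
    if kind not in ("vector-search", "functions"):
        raise ValueError(f"unknown managed MCP server kind: {kind!r}")
    return f"https://{workspace_host}/api/2.0/mcp/{kind}/{catalog}/{schema}"

print(mcp_url("my-workspace.cloud.databricks.com", "functions", "prod", "billing"))
# → https://my-workspace.cloud.databricks.com/api/2.0/mcp/functions/prod/billing
```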

Add each index and function as a resource under App resources. Follow the same steps as Grant permissions to other Databricks resources.

Custom MCP servers hosted as their own Databricks apps (app names prefixed with mcp-) are not yet supported as bundle resources. Grant the agent's service principal Can Use on the MCP server app manually with databricks apps update-permissions. See the custom-mcp-server skill in the agent templates repository.

Next steps