
AI Gateway (Beta)

This page covers the new AI Gateway (visible in the left navigation of the UI), which is currently in Beta. Account admins can enable access to this feature on the Previews page of the account console. See Manage Databricks previews.

For details on the previous version of AI Gateway, see AI Gateway for serving endpoints.

AI Gateway is the Databricks central AI governance layer for LLM endpoints, MCP servers, and coding agents. Use AI Gateway to analyze usage, configure permissions, enforce guardrails, and manage capacity across providers.

LLMs

AI Gateway provides enterprise governance for LLM endpoints and coding agents, including a unified UI, improved observability, and expanded API coverage.

| Topic | Description |
| --- | --- |
| AI Gateway for LLM endpoints | Learn about AI Gateway (Beta) for LLM endpoints and how to get started. |
| Configure AI Gateway endpoints | Create and configure AI Gateway endpoints for your LLMs and coding agents. |
| Query AI Gateway endpoints | Query AI Gateway endpoints using the OpenAI client and other supported APIs. |
| Monitor usage for AI Gateway endpoints | Monitor usage and costs for AI Gateway endpoints using system tables. |
| Monitor models using inference tables | Monitor and audit requests and responses in Unity Catalog Delta tables. |
| Configure rate limits for AI Gateway endpoints | Enforce consumption limits on AI Gateway endpoints to manage capacity and costs. |
| Integrate with coding agents | Integrate coding agents like Cursor, Gemini CLI, Codex CLI, and Claude Code with AI Gateway. |
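AI Gateway endpoints are queried through an OpenAI-compatible chat completions API. As a minimal sketch, the request below follows the model serving invocations convention; the workspace URL, endpoint name, and token are placeholders, and the exact Beta path may differ from this assumption.

```python
import json
import urllib.request


def build_chat_request(workspace_url, endpoint_name, token, messages):
    """Assemble an OpenAI-style chat completions request for a gateway endpoint.

    The URL path follows the model serving invocations convention; the
    workspace URL, endpoint name, and token passed in are placeholders.
    """
    url = f"{workspace_url}/serving-endpoints/{endpoint_name}/invocations"
    headers = {
        "Authorization": f"Bearer {token}",
        "Content-Type": "application/json",
    }
    body = {"messages": messages, "max_tokens": 128}
    return urllib.request.Request(
        url, data=json.dumps(body).encode(), headers=headers, method="POST"
    )


req = build_chat_request(
    "https://example.cloud.databricks.com",  # hypothetical workspace URL
    "my-gateway-endpoint",                   # hypothetical endpoint name
    "dapi-XXXX",                             # placeholder token
    [{"role": "user", "content": "Hello"}],
)
print(req.full_url)
# Sending is intentionally left out; urllib.request.urlopen(req) would send it.
```

The same endpoint can also be reached with the OpenAI Python client by pointing its base URL at the workspace, which is the approach the query documentation describes.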

Note: AI Gateway features don't incur charges during Beta.

MCP servers

AI Gateway provides governance for MCP servers, giving you visibility, access control, and audit logging across all MCP interactions.

| Topic | Description |
| --- | --- |
| Model Context Protocol (MCP) on Databricks | Learn about MCP server types on Databricks and how to get started. |
| Use Databricks managed MCP servers | Immediately access Databricks features using pre-configured MCP servers. |
| Use external MCP servers | Securely connect to MCP servers hosted outside of Databricks using managed connections. |
| Host custom MCP servers using Databricks apps | Host a custom MCP server as a Databricks app. |
| Connect non-Databricks clients to Databricks MCP servers | Connect MCP clients to your Databricks MCP servers. |
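Under the hood, an MCP client talks to a server with JSON-RPC 2.0 messages. The sketch below assembles the two messages a client typically sends first, an `initialize` handshake followed by `tools/list`; the protocol version string and client name are illustrative, and the current version should be taken from the MCP specification.

```python
import json


def jsonrpc(method, params=None, msg_id=None):
    """Assemble a JSON-RPC 2.0 message; omit "id" for notifications."""
    msg = {"jsonrpc": "2.0", "method": method}
    if params is not None:
        msg["params"] = params
    if msg_id is not None:
        msg["id"] = msg_id
    return msg


# A client first negotiates capabilities with "initialize", then can ask the
# server to enumerate its tools with "tools/list".
init = jsonrpc(
    "initialize",
    {
        "protocolVersion": "2025-03-26",  # illustrative version string
        "capabilities": {},
        "clientInfo": {"name": "example-client", "version": "0.1"},
    },
    msg_id=1,
)
list_tools = jsonrpc("tools/list", msg_id=2)

print(json.dumps(init))
print(json.dumps(list_tools))
```

In practice these messages travel over the server's HTTP transport with authentication headers; the governance layer described above sits in that path to provide access control and audit logging.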

Model serving endpoints (previous)

The previous version of AI Gateway provides governance features for model serving endpoints, including external model endpoints, Foundation Model API endpoints, and custom model endpoints.

| Topic | Description |
| --- | --- |
| AI Gateway for serving endpoints | Learn about AI Gateway features for serving endpoints, including supported features and limitations. |
| Configure AI Gateway on model serving endpoints | Configure AI Gateway features such as usage tracking, payload logging, rate limits, and guardrails on a model serving endpoint. |
| Monitor served models using AI Gateway-enabled inference tables | Track and audit request and response data for served models in inference tables enabled through AI Gateway. |
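To make the previous version's feature set concrete, the sketch below assembles the kind of payload used to configure AI Gateway on a serving endpoint (usage tracking, inference tables, and a per-user rate limit). The field names reflect the documented serving endpoints AI Gateway API at the time of writing; treat the exact shape, and the example catalog and schema names, as assumptions and check the current API reference.

```python
import json

# Hedged sketch of an AI Gateway configuration for a model serving endpoint.
# Field names follow the serving endpoints AI Gateway API as documented; the
# catalog and schema names are examples, not real values.
ai_gateway_config = {
    "usage_tracking_config": {"enabled": True},
    "inference_table_config": {
        "enabled": True,
        "catalog_name": "main",       # example Unity Catalog catalog
        "schema_name": "ai_gateway",  # example schema
    },
    "rate_limits": [
        # Limit each user to 100 calls per minute on this endpoint.
        {"calls": 100, "key": "user", "renewal_period": "minute"}
    ],
}

print(json.dumps(ai_gateway_config, indent=2))
```

A payload like this would be sent to the endpoint's AI Gateway configuration API; the configuration topic above walks through each of these features in detail.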