AI Gateway
This page covers the new AI Gateway (visible in the left nav of the UI), which is currently in Beta. Account admins can enable access to this feature on the Previews page in the account console. See Manage Databricks previews.
For details on the previous version of AI Gateway, see AI Gateway for serving endpoints.
AI Gateway is Databricks' central AI governance layer for LLM endpoints, MCP servers, and coding agents. Use AI Gateway to analyze usage, configure permissions, enforce guardrails, and manage capacity across providers.
LLMs
AI Gateway provides enterprise governance for LLM endpoints and coding agents, including a unified UI, improved observability, and expanded API coverage.
- Learn about AI Gateway (Beta) for LLM endpoints and how to get started.
- Create and configure AI Gateway endpoints for your LLMs and coding agents.
- Query AI Gateway endpoints using the OpenAI client and other supported APIs.
- Monitor usage and costs for AI Gateway endpoints using system tables.
- Monitor and audit requests and responses in Unity Catalog Delta tables.
- Enforce consumption limits on AI Gateway endpoints to manage capacity and costs.
- Integrate coding agents like Cursor, Gemini CLI, Codex CLI, and Claude Code with AI Gateway.
AI Gateway features don't incur charges during Beta.
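The querying topic above boils down to a plain HTTP call: Databricks serving endpoints expose an invocations route that accepts OpenAI-style chat payloads. A minimal sketch using only the standard library, where the workspace host, token, and endpoint name are placeholders you must replace:

```python
import json
import urllib.request

# Hypothetical values -- substitute your workspace host, a Databricks
# personal access token, and the name of your endpoint.
WORKSPACE_HOST = "https://<workspace-host>"
TOKEN = "<databricks-token>"
ENDPOINT_NAME = "my-gateway-endpoint"

# Chat-style request body in the OpenAI-compatible format that
# serving endpoints accept.
payload = {
    "messages": [{"role": "user", "content": "What is AI Gateway?"}],
    "max_tokens": 128,
}

request = urllib.request.Request(
    f"{WORKSPACE_HOST}/serving-endpoints/{ENDPOINT_NAME}/invocations",
    data=json.dumps(payload).encode("utf-8"),
    headers={
        "Authorization": f"Bearer {TOKEN}",
        "Content-Type": "application/json",
    },
)
# Sending the request requires a live workspace and a valid token:
# with urllib.request.urlopen(request) as resp:
#     print(json.load(resp))
```

The same endpoint can be queried with the OpenAI Python client by pointing its `base_url` at `<workspace-host>/serving-endpoints` and passing the endpoint name as the model.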
MCPs
AI Gateway provides governance for MCP servers, giving you visibility, access control, and audit logging across all MCP interactions.
- Learn about MCP server types on Databricks and how to get started.
- Immediately access Databricks features using pre-configured MCP servers.
- Securely connect to MCP servers hosted outside of Databricks using managed connections.
- Host a custom MCP server as a Databricks App.
- Connect MCP clients to your Databricks MCP servers.
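Under the hood, MCP is built on JSON-RPC 2.0, so a client's first interaction with a server is often a `tools/list` request to discover what the server exposes. A minimal sketch, assuming a hypothetical managed MCP server URL (the exact path for your workspace and server type comes from the Databricks MCP docs):

```python
import json
import urllib.request

# Hypothetical URL and token -- check the Databricks MCP documentation
# for your workspace's actual managed MCP server path.
MCP_SERVER_URL = "https://<workspace-host>/api/2.0/mcp/<server-path>"
TOKEN = "<databricks-token>"

# MCP uses JSON-RPC 2.0; tools/list asks the server to enumerate
# the tools it exposes to connected clients.
rpc_request = {"jsonrpc": "2.0", "id": 1, "method": "tools/list"}

request = urllib.request.Request(
    MCP_SERVER_URL,
    data=json.dumps(rpc_request).encode("utf-8"),
    headers={
        "Authorization": f"Bearer {TOKEN}",
        "Content-Type": "application/json",
    },
)
# Sending the request requires a live workspace and a valid token:
# with urllib.request.urlopen(request) as resp:
#     print(json.load(resp))
```

In practice most users configure an MCP client (such as a coding agent) with the server URL and let it handle the JSON-RPC exchange; the sketch just shows what travels over the wire.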
Model serving endpoints (previous)
The previous version of AI Gateway provides governance features for model serving endpoints, including external model endpoints, Foundation Model API endpoints, and custom model endpoints.
- Learn about AI Gateway features for serving endpoints, including supported features and limitations.
- Configure AI Gateway features such as usage tracking, payload logging, rate limits, and guardrails on a model serving endpoint.
- Monitor served models using AI Gateway-enabled inference tables.
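The previous AI Gateway is configured per serving endpoint through the REST API. A minimal sketch that builds a `PUT` request enabling usage tracking and a per-user rate limit; the host, token, and endpoint name are placeholders, and the field names should be checked against the serving-endpoints API reference for your workspace version:

```python
import json
import urllib.request

# Placeholders -- substitute your workspace host, token, and endpoint name.
WORKSPACE_HOST = "https://<workspace-host>"
TOKEN = "<databricks-token>"
ENDPOINT_NAME = "my-serving-endpoint"

# Example AI Gateway configuration: turn on usage tracking and cap each
# user at 100 calls per minute. Verify the schema in the API reference.
gateway_config = {
    "usage_tracking_config": {"enabled": True},
    "rate_limits": [
        {"calls": 100, "key": "user", "renewal_period": "minute"},
    ],
}

request = urllib.request.Request(
    f"{WORKSPACE_HOST}/api/2.0/serving-endpoints/{ENDPOINT_NAME}/ai-gateway",
    data=json.dumps(gateway_config).encode("utf-8"),
    method="PUT",
    headers={
        "Authorization": f"Bearer {TOKEN}",
        "Content-Type": "application/json",
    },
)
# Sending the request requires a live workspace and a valid token:
# with urllib.request.urlopen(request) as resp:
#     print(json.load(resp))
```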