Host custom MCP servers using Databricks apps
Host your own custom or third-party MCP servers as Databricks apps. Custom MCP servers are useful if you already have an MCP server you want to deploy or if you want to run a third-party MCP server as a source of tools.
Requirements
- An MCP server hosted as a Databricks app must implement an HTTP-compatible transport, such as the streamable HTTP transport.
Host an MCP server as a Databricks app
See GitHub - custom MCP server template for an example of writing your own MCP server and deploying it as a Databricks app.
To host an existing Python MCP server as a Databricks app, follow these steps:
Set up your environment
Before deploying your MCP server, authenticate to your workspace using OAuth.
- Run the following in a local terminal:

  ```bash
  databricks auth login --host https://<your-workspace-hostname>
  ```
Set up the MCP server
Use `uv` for dependency management and unified tooling when deploying your MCP server.
- Add a `requirements.txt` file to the MCP server's root directory and include `uv` as a dependency. When you add `uv` to `requirements.txt`, it handles installing additional dependencies defined in your project configuration.

  Example `requirements.txt`:

  ```txt
  uv
  ```
- Create a `pyproject.toml` file that defines a script entry point for your server.

  Example `pyproject.toml`:

  ```toml
  [project.scripts]
  custom-server = "server.main:main"
  ```

  In this example:

  - `custom-server` is the script name you use with `uv run`
  - `server.main:main` specifies the module path (`server/main.py`) and function (`main`) to execute
- Add an `app.yaml` file specifying the CLI command to run the MCP server using `uv run`. By default, Databricks apps listen on port 8000. If the server listens on a different port, set it using an environment variable override in the `app.yaml` file.

  Example `app.yaml`:

  ```yaml
  command: [
    'uv',
    'run',
    'custom-server', # This must match a script defined in pyproject.toml
  ]
  ```
When you run `uv run custom-server`, uv looks up the script definition, finds the module path, and calls the `main()` function.
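This `module:function` entry-point convention is the standard one used across Python packaging tools. As a rough sketch of how such a spec resolves (using only the standard library; `json:dumps` stands in for a real spec like `server.main:main` so the example is runnable on its own):

```python
import importlib

def resolve_entry_point(spec: str):
    """Resolve a 'module.path:function' spec to the callable it names --
    roughly what the launcher generated for [project.scripts] does."""
    module_path, _, attr = spec.partition(":")
    module = importlib.import_module(module_path)
    return getattr(module, attr)

# 'json:dumps' stands in for 'server.main:main' here
fn = resolve_entry_point("json:dumps")
print(fn({"ok": True}))  # → {"ok": true}
```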
Deploy the MCP server as a Databricks app
- Create a Databricks app to host the MCP server:

  ```bash
  databricks apps create custom-mcp-server
  ```

- Upload the source code to Databricks and deploy the app by running the following commands from the directory containing your `app.yaml` file:

  ```bash
  DATABRICKS_USERNAME=$(databricks current-user me | jq -r .userName)
  databricks sync . "/Users/$DATABRICKS_USERNAME/custom-mcp-server"
  databricks apps deploy custom-mcp-server --source-code-path "/Workspace/Users/$DATABRICKS_USERNAME/custom-mcp-server"
  ```
Find your deployed app URL
After deployment, you can find your app URL in the Databricks UI. The MCP server endpoint is available at https://<app-url>/mcp.
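Because the endpoint is always the app URL plus the `/mcp` path, a tiny helper avoids trailing-slash mistakes when configuring clients. A minimal sketch (the hostname below is a placeholder, not a real deployment):

```python
def mcp_endpoint(app_url: str) -> str:
    """Append the /mcp path to a Databricks app URL, tolerating a trailing slash."""
    return app_url.rstrip("/") + "/mcp"

# Placeholder hostname; substitute your deployed app URL
print(mcp_endpoint("https://my-app-1234567890.databricksapps.com/"))
# → https://my-app-1234567890.databricksapps.com/mcp
```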
Connect to the custom MCP server
Click the tabs to see how to connect to an MCP server from various environments.
- Local environment
- Notebook (service principal)
- Agent code (on-behalf-of-user)
- Agent code (service principal)
Authenticate to your workspace using OAuth as described in Set up your environment.
The following example shows how to connect to the custom MCP server and list available tools:
```python
from databricks_mcp import DatabricksMCPClient
from databricks.sdk import WorkspaceClient

# Replace with your deployed app URL
# Example: https://custom-mcp-server-6051921418418893.staging.aws.databricksapps.com/mcp
mcp_server_url = "https://<app-url>/mcp"

databricks_cli_profile = "DEFAULT"
workspace_client = WorkspaceClient(profile=databricks_cli_profile)
mcp_client = DatabricksMCPClient(server_url=mcp_server_url, workspace_client=workspace_client)

# List available tools
tools = mcp_client.list_tools()
print(f"Available tools: {tools}")
```
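The descriptors returned by `list_tools()` carry a `name` field (per the MCP tool schema). Selecting a specific tool from the list can be sketched with stand-in objects, since the real descriptors require a live workspace connection:

```python
from types import SimpleNamespace

def find_tool(tools, name):
    """Return the first tool descriptor whose name matches, or None."""
    return next((t for t in tools if t.name == name), None)

# Stand-in descriptors; in practice these come from mcp_client.list_tools()
tools = [SimpleNamespace(name="query_docs"), SimpleNamespace(name="run_sql")]
print(find_tool(tools, "run_sql").name)  # → run_sql
```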
Use a service principal to access the hosted Databricks app in a Databricks notebook.
```python
from databricks_mcp import DatabricksMCPClient
from databricks.sdk import WorkspaceClient

# Replace with your deployed app URL
mcp_server_url = "https://<app-url>/mcp"

workspace_client = WorkspaceClient(
    host="<workspace-url>",
    client_id="<client-id>",
    client_secret="<client-secret>"
)
mcp_client = DatabricksMCPClient(server_url=mcp_server_url, workspace_client=workspace_client)

# List available tools
tools = mcp_client.list_tools()
print(f"Available tools: {tools}")
```
Set up on-behalf-of-user authorization. See On-behalf-of-user authentication.
The following example shows how to enable on-behalf-of-user access using ModelServingUserCredentials to access the hosted Databricks app from an agent:
```python
from databricks_mcp import DatabricksMCPClient
from databricks.sdk import WorkspaceClient
from databricks.sdk.credentials_provider import ModelServingUserCredentials

# Replace with your deployed app URL
mcp_server_url = "https://<app-url>/mcp"

workspace_client = WorkspaceClient(credentials_strategy=ModelServingUserCredentials())
mcp_client = DatabricksMCPClient(server_url=mcp_server_url, workspace_client=workspace_client)

# List available tools
tools = mcp_client.list_tools()
print(f"Available tools: {tools}")
```
Log the agent model using the `apps.apps` scope. See On-behalf-of-user authentication.
Enable system authentication using a service principal to access the hosted Databricks app from an agent:
```python
from databricks_mcp import DatabricksMCPClient
from databricks.sdk import WorkspaceClient

# Replace with your deployed app URL
mcp_server_url = "https://<app-url>/mcp"

workspace_client = WorkspaceClient()
mcp_client = DatabricksMCPClient(server_url=mcp_server_url, workspace_client=workspace_client)

# List available tools
tools = mcp_client.list_tools()
print(f"Available tools: {tools}")
```
Log the agent model using `DatabricksApps` as a resource. See Automatic authentication passthrough.
Example notebooks: Build an agent with Databricks MCP servers
The following notebooks show how to author LangGraph and OpenAI agents that call MCP tools with custom MCP servers hosted on Databricks apps.
LangGraph MCP tool-calling agent
OpenAI MCP tool-calling agent
Additional resources
The apps cookbook provides end-to-end code examples for integrating MCP servers with different frameworks.
For complete source code and additional examples, see the Databricks Apps Cookbook repository.