
Host custom MCP servers using Databricks apps

This feature is in Beta.

Host your own custom or third-party MCP servers as Databricks apps. Custom MCP servers are useful if you already have an MCP server you want to deploy or if you want to run a third-party MCP server as a source of tools.

Requirements

  • An MCP server hosted as a Databricks app must implement an HTTP-compatible transport, such as the streamable HTTP transport.

Host an MCP server as a Databricks app

See the custom MCP server repo for an example of writing your own MCP server and deploying it as a Databricks app.
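
For orientation, a minimal Python MCP server that serves the required streamable HTTP transport might look like the following. This is a sketch that assumes the mcp Python package (FastMCP); the server name and example tool are illustrative.

Python
from mcp.server.fastmcp import FastMCP

# Databricks apps expect the server to listen on port 8000 by default.
mcp = FastMCP("my-custom-server", host="0.0.0.0", port=8000)

@mcp.tool()
def add(a: int, b: int) -> int:
    """Add two integers."""
    return a + b

if __name__ == "__main__":
    # Serve the HTTP-compatible transport required for Databricks apps.
    mcp.run(transport="streamable-http")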

To host an existing Python MCP server as a Databricks app, follow these steps:

Set up your environment

  1. Use OAuth to authenticate to your workspace. Run the following in a local terminal:

    Bash
    databricks auth login --host https://<your-workspace-hostname>

Set up the MCP server

  1. Add a requirements.txt to your server's root directory and specify Python dependencies for your server.

    Python MCP servers often use uv for package management. If you use uv, add uv to requirements.txt and uv installs your server's remaining dependencies (an example follows these steps).

  2. Add an app.yaml specifying the CLI command to run your server.

    By default, Databricks apps listen on port 8000. If your server listens on a different port, set it using an environment variable override in the app.yaml file.

    Example app.yaml:

    YAML
    command: [
      'uv',
      'run',
      'your-server-name',
      ..., # optionally include additional parameters here
    ]
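
For example, a uv-managed server's requirements.txt can contain the single line uv. If your server reads its listening port from an environment variable, set that variable with an env block in app.yaml. The sketch below assumes a hypothetical variable name (MCP_SERVER_PORT); use whatever name your server actually reads.

YAML
command: ['uv', 'run', 'your-server-name']
env:
  - name: 'MCP_SERVER_PORT'  # hypothetical name; depends on your server
    value: '8000'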

Upload the MCP server as a Databricks app

  1. Create a Databricks app to host your MCP server:

    Bash
    databricks apps create mcp-my-custom-server
  2. Upload the source code to Databricks and deploy the app by running the following commands from the directory containing your app.yaml file:

    Bash
    DATABRICKS_USERNAME=$(databricks current-user me | jq -r .userName)
    databricks sync . "/Users/$DATABRICKS_USERNAME/mcp-my-custom-server"
    databricks apps deploy mcp-my-custom-server --source-code-path "/Workspace/Users/$DATABRICKS_USERNAME/mcp-my-custom-server"

Connect to the custom MCP server

The following example shows how to connect to the custom MCP server from a local Python environment.

Use OAuth to authenticate to your workspace. Run the following in a local terminal:

Bash
databricks auth login --host https://<your-workspace-hostname>

Use the authenticated profile from the previous step to access the hosted Databricks app:

Python
from databricks_mcp import DatabricksOAuthClientProvider
from databricks.sdk import WorkspaceClient
from mcp.client.session import ClientSession
from mcp.client.streamable_http import streamablehttp_client

databricks_cli_profile = "DEFAULT"
workspace_client = WorkspaceClient(profile=databricks_cli_profile)

# Replace with your custom MCP server's URL. A Databricks app has its own URL
# (shown on the app's page in your workspace); the path depends on how your
# server exposes its streamable HTTP endpoint (FastMCP servers typically use /mcp).
mcp_server_url = "https://<your-app-url>/mcp"

async def test_connection_to_server():
    async with streamablehttp_client(
        f"{mcp_server_url}", auth=DatabricksOAuthClientProvider(workspace_client)
    ) as (read_stream, write_stream, _), ClientSession(
        read_stream, write_stream
    ) as session:
        # Initialize the MCP session before issuing requests
        await session.initialize()

        # List available tools
        tools = await session.list_tools()
        print(f"Available tools: {[tool.name for tool in tools.tools]}")

Deploy an agent

The following notebooks show how to author LangGraph and OpenAI agents that call MCP tools.

  • LangGraph MCP tool-calling agent
  • OpenAI MCP tool-calling agent
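
One way to wire MCP tools into an OpenAI-style tool-calling loop is sketched below. It reuses the connection pattern from the previous section; the endpoint name databricks-claude-3-7-sonnet, the helper name, and the single-round tool handling are assumptions for illustration, not the notebooks' code.

Python
import json

from databricks.sdk import WorkspaceClient
from databricks_mcp import DatabricksOAuthClientProvider
from mcp.client.session import ClientSession
from mcp.client.streamable_http import streamablehttp_client

workspace_client = WorkspaceClient(profile="DEFAULT")
mcp_server_url = "https://<your-app-url>/mcp"  # same URL as above

async def answer_with_tools(prompt: str):
    async with streamablehttp_client(
        mcp_server_url, auth=DatabricksOAuthClientProvider(workspace_client)
    ) as (read_stream, write_stream, _), ClientSession(
        read_stream, write_stream
    ) as session:
        await session.initialize()

        # Convert MCP tool definitions into OpenAI-style tool specs.
        mcp_tools = (await session.list_tools()).tools
        openai_tools = [
            {
                "type": "function",
                "function": {
                    "name": t.name,
                    "description": t.description or "",
                    "parameters": t.inputSchema,
                },
            }
            for t in mcp_tools
        ]

        # Ask a tool-calling LLM served by the workspace to pick a tool.
        openai_client = workspace_client.serving_endpoints.get_open_ai_client()
        response = openai_client.chat.completions.create(
            model="databricks-claude-3-7-sonnet",  # assumption: any tool-calling endpoint works
            messages=[{"role": "user", "content": prompt}],
            tools=openai_tools,
        )

        # If the model requested a tool, execute it on the MCP server.
        message = response.choices[0].message
        if message.tool_calls:
            call = message.tool_calls[0]
            result = await session.call_tool(
                call.function.name, json.loads(call.function.arguments)
            )
            return result.content
        return message.content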

Additional resources

The Databricks Apps Cookbook provides end-to-end code examples for integrating MCP servers with different frameworks. For complete source code and additional examples, see the Databricks Apps Cookbook repository.