
Host custom MCP servers using Databricks apps

Host custom or third-party MCP servers as Databricks apps. Custom MCP servers are useful if you already have an MCP server you want to deploy or if you want to run a third-party MCP server as a source of tools.

Requirements

  • An MCP server hosted as a Databricks app must implement an HTTP-compatible transport, such as the streamable HTTP transport.

Create or deploy a custom MCP server

Choose the option that matches your starting point:

Create from a template: build a new MCP server from scratch

Create a custom MCP server from the Apps template

Use the built-in Hello World MCP Server template to create and deploy an MCP server with example tools already included:

  1. In the sidebar, click Compute.

  2. Click the Apps tab.

  3. Click Create app.

  4. Under the Agents category, select the Hello World MCP Server template.

  5. Enter an app name starting with mcp- (for example, mcp-hello-world).

    note

    The app name must start with mcp- to be recognized as an MCP server in the AI Playground.

  6. Click Create app.

Databricks deploys the app with example code that you can customize.

The template includes two example tools to get you started:

  • health(): A diagnostic tool that confirms the server is operational and returns status information.
  • get_current_user(): A tool that retrieves the current user's information using the Databricks SDK, demonstrating how to integrate workspace authentication.

Add a custom tool

To add your own tool, open the app source code and define a new function using the @mcp.tool() decorator. For example, the following tool converts a string to uppercase:

Python
@mcp.tool()
def uppercase(text: str) -> str:
    """Convert a string to uppercase."""
    return text.upper()

Each tool must include a docstring. AI agents use the docstring to understand when to call the tool. After adding a tool, redeploy the app to make it available.

See Create an app from a template for more details on working with app templates, or see the template source code on GitHub.

Deploy an existing server: deploy an existing MCP server as a Databricks app

Host an existing MCP server as a Databricks App

To host an existing Python MCP server as a Databricks app, follow these steps:

Set up your environment

Before deploying your MCP server, authenticate to your workspace using OAuth.

  1. Run the following in a local terminal:

    Bash
    databricks auth login --host https://<your-workspace-hostname>

Set up the MCP server

Use uv for dependency management and unified tooling when deploying your MCP server.

  1. Add a requirements.txt to the MCP server's root directory and include uv as a dependency.

    uv handles installing additional dependencies defined in your project configuration.

    Txt
    uv

  2. Create a pyproject.toml file that defines a script entry point for your server.

    Example pyproject.toml:

    Toml
    [project.scripts]
    custom-server = "server.main:main"

    In this example:

    • custom-server is the script name you use with uv run
    • server.main:main specifies the module path (server/main.py) and function (main) to execute
  3. Add an app.yaml file specifying the CLI command to run the MCP server using uv run.

    By default, Databricks apps listen on port 8000. If the server listens on a different port, set it using an environment variable override in the app.yaml file.

    Example app.yaml:

    YAML
    command: [
      'uv',
      'run',
      'custom-server', # This must match a script defined in pyproject.toml
    ]

When you run uv run custom-server, uv looks up the script definition, finds the module path, and calls the main() function.
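If your server reads its port from an environment variable, you can set it alongside the command in app.yaml. The variable name PORT below is an assumption; use whatever name your server actually reads:

```yaml
command: [
  'uv',
  'run',
  'custom-server',
]
env:
  - name: 'PORT' # assumed variable name; match what your server reads
    value: '8000'
```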

Deploy the MCP server as a Databricks app

  1. Create a Databricks app to host the MCP server:

    Bash
    databricks apps create mcp-my-server

    note

    Prefix your app name with mcp- to clearly identify it as an MCP server. This naming convention helps with discoverability and organization in your workspace.

  2. Upload the source code to Databricks and deploy the app by running the following commands from the directory containing your app.yaml file:

    Bash
    DATABRICKS_USERNAME=$(databricks current-user me | jq -r .userName)
    databricks sync . "/Users/$DATABRICKS_USERNAME/mcp-my-server"
    databricks apps deploy mcp-my-server --source-code-path "/Workspace/Users/$DATABRICKS_USERNAME/mcp-my-server"

Find your deployed app URL

After deployment, you can find your app URL in the Databricks UI. The MCP server endpoint is available at https://<app-url>/mcp.

Connect to the custom MCP server

Authenticate to your workspace using OAuth as described in Set up your environment.

The following example shows how to connect to the custom MCP server and list available tools:

Python
from databricks_mcp import DatabricksMCPClient
from databricks.sdk import WorkspaceClient

# Replace with your deployed app URL
# Example: https://mcp-my-server-6051921418418893.aws.databricksapps.com/mcp
mcp_server_url = "https://<app-url>/mcp"

databricks_cli_profile = "DEFAULT"
workspace_client = WorkspaceClient(profile=databricks_cli_profile)

mcp_client = DatabricksMCPClient(server_url=mcp_server_url, workspace_client=workspace_client)

# List available tools
tools = mcp_client.list_tools()
print(f"Available tools: {tools}")

Example notebooks: Build an agent with Databricks MCP servers

The following notebooks show how to author LangGraph and OpenAI agents that call tools on custom MCP servers hosted as Databricks apps.

LangGraph MCP tool-calling agent

OpenAI MCP tool-calling agent

Additional resources

The Databricks Apps Cookbook provides end-to-end code examples for integrating MCP servers with different frameworks. For complete source code and additional examples, see the Databricks Apps Cookbook repository.