
Integrate Anthropic with Databricks Unity Catalog tools

Use Databricks Unity Catalog to integrate SQL and Python functions as tools in Anthropic SDK LLM calls. This integration combines the governance of Unity Catalog with Anthropic models to create powerful gen AI apps.

Requirements

  • Use Databricks Runtime 15.0 and above.

Integrate Unity Catalog tools with Anthropic

Run the following code in a notebook or Python script to create a Unity Catalog tool and use it while calling an Anthropic model.

  1. Install the Databricks Unity Catalog integration package for Anthropic.

    Python
    %pip install unitycatalog-anthropic[databricks]
    dbutils.library.restartPython()
  2. Create an instance of the Unity Catalog functions client.

    Python
    from unitycatalog.ai.core.base import get_uc_function_client

    client = get_uc_function_client()
  3. Create a Unity Catalog function written in Python.

    Python
    CATALOG = "your_catalog"
    SCHEMA = "your_schema"

    func_name = f"{CATALOG}.{SCHEMA}.weather_function"

    def weather_function(location: str) -> str:
        """
        Fetches the current weather for a given location in degrees Celsius.

        Args:
            location (str): The location to fetch the current weather for.

        Returns:
            str: The current temperature for the location, in Celsius.
        """
        return f"The current temperature for {location} is 24.5 celsius"

    client.create_python_function(
        func=weather_function,
        catalog=CATALOG,
        schema=SCHEMA,
        replace=True,
    )
  4. Create an instance of the Unity Catalog function as a toolkit.

    Python
    from unitycatalog.ai.anthropic.toolkit import UCFunctionToolkit

    # Create an instance of the toolkit
    toolkit = UCFunctionToolkit(function_names=[func_name], client=client)
  5. Use a tool call in Anthropic.

    Python
    import anthropic

    # Initialize the Anthropic client with your API key
    anthropic_client = anthropic.Anthropic(api_key="YOUR_ANTHROPIC_API_KEY")

    # User's question
    question = [{"role": "user", "content": "What's the weather in New York City?"}]

    # Make the initial call to Anthropic
    response = anthropic_client.messages.create(
        model="claude-3-5-sonnet-20240620",  # Specify the model
        max_tokens=1024,  # The Messages API uses 'max_tokens', not 'max_tokens_to_sample'
        tools=toolkit.tools,
        messages=question,  # Provide the conversation history
    )

    # Print the response content
    print(response)
  6. Construct a tool response. If a tool needs to be called, the response from the Claude model contains a tool use block with the request metadata.

    Python
    from unitycatalog.ai.anthropic.utils import generate_tool_call_messages

    # Call the UC function and construct the required formatted response
    tool_messages = generate_tool_call_messages(
        response=response,
        client=client,
        conversation_history=question,
    )

    # Continue the conversation with Anthropic
    tool_response = anthropic_client.messages.create(
        model="claude-3-5-sonnet-20240620",
        max_tokens=1024,
        tools=toolkit.tools,
        messages=tool_messages,
    )

    print(tool_response)
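
The printed tool_response carries the model's final answer in its text content blocks. As a minimal sketch (the final_text helper and the mocked content below are illustrative, not part of either SDK), pulling out that text could look like:

```python
def final_text(content_blocks):
    # Concatenate the text blocks from an Anthropic message's content list.
    # Blocks may be plain dicts (as mocked here) or SDK objects exposing
    # .type and .text attributes; this sketch handles both.
    parts = []
    for block in content_blocks:
        block_type = block["type"] if isinstance(block, dict) else block.type
        if block_type == "text":
            parts.append(block["text"] if isinstance(block, dict) else block.text)
    return "".join(parts)

# Example with a mocked final response's content list:
mock_content = [{"type": "text", "text": "It's 24.5°C in New York City."}]
print(final_text(mock_content))  # → It's 24.5°C in New York City.
```

In the real flow, you would pass tool_response.content instead of the mocked list.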

The unitycatalog-anthropic package includes a message handler utility that simplifies parsing and handling a call to the Unity Catalog function. The utility does the following:

  1. Detects tool calling requirements.
  2. Extracts tool calling information from the model's response.
  3. Performs the call to the Unity Catalog function.
  4. Parses the response from the Unity Catalog function.
  5. Crafts the next message format to continue the conversation with Claude.
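
The handler's workflow can be sketched in plain Python. This is a hedged illustration, not the package's implementation: the response shape follows the Anthropic Messages API (a stop_reason of "tool_use" and content blocks carrying the tool name and input), and execute_function is a stub standing in for the Unity Catalog client's real function execution.

```python
def execute_function(name, arguments):
    # Hypothetical stub: a real handler would invoke the Unity Catalog
    # function through the client here.
    if name == "weather_function":
        return f"The current temperature for {arguments['location']} is 24.5 celsius"
    raise ValueError(f"Unknown tool: {name}")

def handle_tool_calls(response):
    # 1. Detect whether the model requested a tool call.
    if response["stop_reason"] != "tool_use":
        return None

    results = []
    for block in response["content"]:
        if block["type"] != "tool_use":
            continue
        # 2. Extract the tool name and input arguments.
        name, arguments = block["name"], block["input"]
        # 3. Call the function.
        output = execute_function(name, arguments)
        # 4./5. Package the result as a tool_result block so the
        # conversation can continue.
        results.append(
            {"type": "tool_result", "tool_use_id": block["id"], "content": output}
        )
    # The next message in the conversation carries the tool results.
    return {"role": "user", "content": results}

# Example: a mocked model response requesting a weather lookup.
mock_response = {
    "stop_reason": "tool_use",
    "content": [
        {"type": "tool_use", "id": "toolu_01", "name": "weather_function",
         "input": {"location": "New York City"}}
    ],
}
print(handle_tool_calls(mock_response))
```

generate_tool_call_messages performs these steps for you against the live response object and also stitches in the prior conversation history.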
note

The entire conversation history must be provided in the conversation_history argument to the generate_tool_call_messages API. Claude models require the full conversation: the original user question that initialized it, plus all subsequent LLM-generated responses and multi-turn tool call results.
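
For reference, after one tool-use turn the full history sent back to the model has three entries: the original user question, the assistant's tool-use turn, and the tool results. This sketch assembles that list by hand with placeholder values to show the shape; in the real flow, generate_tool_call_messages builds it from the live response.

```python
# Illustrative only: the content values below are placeholders, not real
# API objects.
question = [{"role": "user", "content": "What's the weather in New York City?"}]

# The assistant's turn, echoing back the tool_use block it produced.
assistant_turn = {
    "role": "assistant",
    "content": [
        {"type": "tool_use", "id": "toolu_01", "name": "weather_function",
         "input": {"location": "New York City"}}
    ],
}

# The tool results, sent back as a user turn referencing the tool_use_id.
tool_result_turn = {
    "role": "user",
    "content": [
        {"type": "tool_result", "tool_use_id": "toolu_01",
         "content": "The current temperature for New York City is 24.5 celsius"}
    ],
}

# Claude requires the entire history: the original question plus every
# later turn, in order.
messages = question + [assistant_turn, tool_result_turn]
print(len(messages))  # → 3
```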