Build and share a chat UI with Databricks Apps
Use Databricks Apps to build a custom chat UI for your deployed agent. This lets you share your agent in an interface that gives you control over things like branding and how you display agent output.
If you're just getting started building your agent and need a chat UI for pre-production testing, use the built-in review app instead. The approach in this article is intended for use cases that need more UI customization than the review app provides.
Example chat application
The example app, e2e-chatbot-app, is hosted on GitHub. It uses Streamlit to create an app that does the following:
- Streaming output: The app streams agent output so users see responses in real time, and falls back to non-streaming output if the endpoint doesn't support streaming (see the sketch after this list).
- Tool calls: The app renders tool calls made by the agent. The agent must be authored using the best practices recommended in Author AI agents in code.
- Feedback: The app collects user feedback on chat responses using an experimental feedback API.
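For context on the streaming-with-fallback behavior, the following is a minimal sketch, not the template's actual code. It assumes the serving endpoint is reachable through the OpenAI-compatible API that Databricks Model Serving exposes, and that the placeholder environment variables `DATABRICKS_HOST` (including `https://`), `DATABRICKS_TOKEN`, and `SERVING_ENDPOINT` are set:

```python
import os
from openai import OpenAI

# Placeholder configuration: DATABRICKS_HOST (including https://), DATABRICKS_TOKEN,
# and SERVING_ENDPOINT are assumed to be set in the environment.
client = OpenAI(
    base_url=f"{os.environ['DATABRICKS_HOST']}/serving-endpoints",
    api_key=os.environ["DATABRICKS_TOKEN"],
)


def chat(messages, endpoint=None):
    """Yield response text, streaming when the endpoint supports it."""
    endpoint = endpoint or os.environ["SERVING_ENDPOINT"]
    try:
        stream = client.chat.completions.create(
            model=endpoint, messages=messages, stream=True
        )
    except Exception:
        # Fall back to a single non-streaming request if streaming is rejected.
        response = client.chat.completions.create(model=endpoint, messages=messages)
        yield response.choices[0].message.content
        return
    for chunk in stream:
        if chunk.choices and chunk.choices[0].delta.content:
            yield chunk.choices[0].delta.content


# Example usage:
for piece in chat([{"role": "user", "content": "Hello!"}]):
    print(piece, end="", flush=True)
```

In recent Streamlit versions, a generator like this can be passed to `st.write_stream` to render the response incrementally in the chat UI.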
Requirements
- You must have an agent deployed to a serving endpoint. It can be one of the following:
  - A custom agent deployed using `agents.deploy()`. See Deploy an agent for generative AI applications. (A minimal deployment sketch follows this list.)
  - A foundation model or external model serving endpoint with the Chat task type. See Supported foundation models on Mosaic AI Model Serving.
- Install Python 3.11 or above to run and test the app locally.
- Install the Databricks CLI to deploy the app to Databricks. See Install or update the Databricks CLI.
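If you haven't deployed an agent yet, the first requirement can be met with the `databricks-agents` package. This is a hedged sketch, assuming an agent model that is already logged with MLflow and registered in Unity Catalog; the model name below is a placeholder:

```python
# Sketch only: deploy a registered agent model to a model serving endpoint.
from databricks import agents

# Placeholder Unity Catalog model name and version.
deployment = agents.deploy("main.default.my_agent", 1)

# The resulting endpoint name is what the chat app needs as SERVING_ENDPOINT.
print(deployment.endpoint_name)
```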
Develop and run the Databricks app locally
Use the example app as a starting point for your own chat UI. Follow these steps to run the app locally:
1. Clone the app template repository:

   ```bash
   git clone https://github.com/databricks/app-templates
   ```

2. Install the required libraries:

   ```bash
   cd e2e-chatbot-app
   pip install -r requirements.txt
   ```

3. To make calls to the agent endpoint, you must authenticate to your Databricks workspace. Generate a personal access token and save the token value. See Databricks personal access tokens for workspace users.

4. Configure the Databricks CLI:

   ```bash
   databricks configure
   ```

   When prompted, provide the Databricks host URL for your workspace (for example, `https://hostname.cloud.databricks.com`) and the personal access token from the previous step.

5. Specify the model serving endpoint name and run the app. To find the model serving endpoint name, go to your workspace and select Serving to see a list of model serving endpoints:

   ```bash
   export SERVING_ENDPOINT=<your-serving-endpoint-name>
   streamlit run app.py
   ```
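Before launching the UI, you can optionally confirm that local authentication and the endpoint name resolve correctly. This is a hedged sketch using the Databricks SDK for Python, whose default credential chain picks up the profile written by `databricks configure` (or the `DATABRICKS_HOST`/`DATABRICKS_TOKEN` environment variables); it assumes a chat-style endpoint:

```python
import os

from databricks.sdk import WorkspaceClient
from databricks.sdk.service.serving import ChatMessage, ChatMessageRole

# Reads ~/.databrickscfg (written by `databricks configure`) or
# DATABRICKS_HOST / DATABRICKS_TOKEN environment variables.
w = WorkspaceClient()

endpoint_name = os.environ["SERVING_ENDPOINT"]

# Send a single test message to verify auth and the endpoint name.
response = w.serving_endpoints.query(
    name=endpoint_name,
    messages=[ChatMessage(role=ChatMessageRole.USER, content="Hello")],
)
print(response.choices[0].message.content)
```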
Deploy the Databricks app
Deploy the example as a Databricks app to share it with others.
1. Run `databricks apps create` to create the Databricks app. The following snippet assumes `SERVING_ENDPOINT` is still set; if not, replace it with your serving endpoint name:

   ```bash
   databricks apps create --json '{
     "name": "my-agent-chatbot",
     "resources": [
       {
         "name": "serving-endpoint",
         "serving_endpoint": {
           "name": "'"$SERVING_ENDPOINT"'",
           "permission": "CAN_QUERY"
         }
       }
     ]
   }'
   ```

2. Upload the source code to Databricks and deploy the app by running the following commands from the `e2e-chatbot-app` directory:

   ```bash
   DATABRICKS_USERNAME=$(databricks current-user me | jq -r .userName)
   databricks sync . "/Users/$DATABRICKS_USERNAME/e2e-chatbot-app"
   databricks apps deploy my-agent-chatbot --source-code-path "/Workspace/Users/$DATABRICKS_USERNAME/e2e-chatbot-app"
   ```

3. Get the URL of your app and test the app:

   ```bash
   databricks apps get my-agent-chatbot | jq -r '.url'
   ```
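Equivalently, you can check the app from Python with the Databricks SDK. This is a hedged sketch; the app name matches the one passed to `databricks apps create` above:

```python
from databricks.sdk import WorkspaceClient

w = WorkspaceClient()

# "my-agent-chatbot" is the name used in `databricks apps create` above.
app = w.apps.get(name="my-agent-chatbot")
print(app.url)         # open this URL in a browser to test the chat UI
print(app.app_status)  # shows whether the latest deployment is running
```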
Share the app
After testing your app, you can grant other users permission to view it. See Configure permissions for your Databricks app.
Share your app URL with others so they can chat with your agent and provide feedback.
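If you prefer to grant access programmatically rather than through the UI, the following is a hedged sketch using the Apps permissions API in the Databricks SDK for Python. The user email is a placeholder, and exact class names can vary across SDK versions; the permissions documentation linked above is authoritative:

```python
from databricks.sdk import WorkspaceClient
from databricks.sdk.service.apps import AppAccessControlRequest, AppPermissionLevel

w = WorkspaceClient()

# Placeholder user: CAN_USE lets the user open the app and chat with the agent.
w.apps.set_permissions(
    app_name="my-agent-chatbot",
    access_control_list=[
        AppAccessControlRequest(
            user_name="teammate@example.com",
            permission_level=AppPermissionLevel.CAN_USE,
        )
    ],
)
```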