Databricks Apps
Databricks Apps enables developers to build and deploy secure data and AI applications directly on the Databricks platform, which eliminates the need for separate infrastructure. Apps are hosted on the Databricks serverless platform and integrate with key platform services, including the following:
- Unity Catalog for data governance
- Databricks SQL for querying data
- Model Serving for deploying AI models
- Databricks Jobs for ETL and automation
- OAuth and service principals for authentication and authorization
You can develop your apps locally, deploy them to a workspace, and move them between workspaces. This hosting model eliminates the need for developers to handle security, infrastructure, and compliance, which simplifies the process of bringing internal data tools to production.
Databricks Apps supports Python frameworks like Streamlit, Dash, and Gradio. For examples that use popular Python frameworks in the Databricks Apps UI, see How do I create an app in the Databricks Apps UI?.
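The snippet below is a minimal sketch of what such an app can look like, assuming Streamlit and the Databricks SDK for Python (both pre-installed in the app environment; see Installed Python libraries) and letting the SDK resolve credentials from the environment. It is an illustration, not an official template.

```python
# app.py: a minimal Streamlit app sketch (illustrative, not an official template).
import streamlit as st
from databricks.sdk import WorkspaceClient

st.title("Hello, Databricks Apps")

# WorkspaceClient() picks up credentials from the environment, for example
# from the Databricks CLI locally or from the app's identity when deployed.
w = WorkspaceClient()
current_user = w.current_user.me()

st.write(f"Connected as {current_user.user_name}")
```

You can iterate on an app like this locally with `streamlit run app.py` before deploying it to a workspace.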
For information about Databricks Apps pricing, see Compute for Apps.
Common use cases
Databricks apps work well for internal tools that combine data, AI, and automation. Example use cases include:
- Interactive data visualizations and embedded Business Intelligence (BI) dashboards
- Retrieval-Augmented Generation (RAG) chat apps powered by Genie
- Custom configuration interfaces for Lakeflow
- Data entry forms backed by Databricks SQL
- Business process automation combining multiple Databricks services
- Custom ops tools for alert triage and response
Requirements
To build, deploy, and run Databricks apps, your environment must meet specific prerequisites. These include requirements for both your Databricks workspace and your local development environment.
Workspace requirements
To deploy and run apps in your Databricks workspace, make sure the workspace meets the following requirements:
- The firewall allows access to the `*.databricksapps.com` domain.
- It's located in a region that supports serverless compute, as Databricks Apps relies on this infrastructure. See Serverless compute feature availability.
Development environment requirements
To create apps locally and deploy them to your Databricks workspace, your development environment must meet the following requirements:
- Python version 3.11 or above installed.
- Databricks CLI v0.229.0 or above, configured to access your Databricks workspace. To install or update the CLI, see Install or update the Databricks CLI. Databricks recommends using OAuth user-to-machine (U2M) authentication. See Authentication for the Databricks CLI.
- Databricks SDK for Python installed. Install with `pip3 install databricks-sdk`.
- (Optional) Databricks SQL Connector for Python installed, if your app needs to access Databricks SQL. Install with `pip3 install databricks-sql-connector`. A quick connectivity check is sketched after this list.
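If your app uses the SQL connector, the following sketch shows the typical connect-and-query pattern against a SQL warehouse. The environment variable names and the use of a personal access token are assumptions for illustration; substitute whatever configuration and authentication method your workspace uses.

```python
# check_sql.py: a hypothetical local sanity check for the Databricks SQL Connector.
import os

from databricks import sql

# Placeholder environment variables (assumptions for this sketch):
# DATABRICKS_SERVER_HOSTNAME and DATABRICKS_HTTP_PATH point at a SQL warehouse,
# and DATABRICKS_TOKEN holds a personal access token.
with sql.connect(
    server_hostname=os.environ["DATABRICKS_SERVER_HOSTNAME"],
    http_path=os.environ["DATABRICKS_HTTP_PATH"],
    access_token=os.environ["DATABRICKS_TOKEN"],
) as connection:
    with connection.cursor() as cursor:
        cursor.execute("SELECT current_user(), current_catalog()")
        print(cursor.fetchone())
```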
Limitations
- A Databricks workspace can host up to 50 apps.
- App files can't exceed 10 MB. If any file in the app directory exceeds this limit, deployment fails with an error.
- Databricks Apps isn't compliant with HIPAA, PCI, or FedRAMP standards.
- Databricks deletes app logs when the compute resource running the app is terminated. See Logging from your Databricks app.
- If you grant consent to an app through user authorization, you can't revoke that consent later.
Databricks Apps system environment
To view the environment for a specific app, including environment variables and installed packages, go to the Environment tab on the details page for the app. See View the details for a Databricks app.
The following describes the system environment in which your apps run, including available resources and pre-installed software versions:
- Operating System: Ubuntu 22.04 LTS
- Python environment: Python 3.11.0, running in a dedicated virtual environment. All dependencies are isolated within this environment, including libraries you install using a `requirements.txt` file (see the example after this list) and pre-installed libraries.
- System resources: Each app can use up to 2 virtual CPUs (vCPUs) and 6 GB of memory. If your app exceeds these limits, Databricks might restart it.
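For example, a `requirements.txt` that overrides one pre-installed library and adds a package that isn't pre-installed could look like the following; the package names and versions are illustrative only, not recommendations:

```
# requirements.txt (illustrative example)
streamlit==1.39.0   # override the pre-installed version
tabulate==0.9.0     # add a library that is not pre-installed
```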
Installed Python libraries
The following Python libraries are pre-installed in the Databricks app environment. You don’t need to include them in your app unless you require a different version.
| Library | Version |
| --- | --- |
| databricks-sql-connector | 3.4.0 |
| databricks-sdk | 0.33.0 |
| mlflow-skinny | 2.16.2 |
| gradio | 4.44.0 |
| streamlit | 1.38.0 |
| shiny | 1.1.0 |
| dash | 2.18.1 |
| flask | 3.0.3 |
| fastapi | 0.115.0 |
| uvicorn[standard] | 0.30.6 |
| gunicorn | 23.0.0 |
| dash-ag-grid | 31.2.0 |
| dash-mantine-components | 0.14.4 |
| dash-bootstrap-components | 1.6.0 |
| plotly | 5.24.1 |
| plotly-resampler | 0.10.0 |