Develop Databricks Apps
Preview
Databricks Apps is in Public Preview.
Note
To deploy and run apps in your Databricks workspace, you must ensure that your firewall does not block the domain `*.databricksapps.com`.
This article describes how to create data and AI apps with Databricks Apps: how to create and edit apps in the UI, how to use Databricks platform features such as SQL warehouses, secrets, and Databricks Jobs, best practices for developing your apps, and important information for developing apps with supported frameworks.
How do I create an app in the Databricks Apps UI?
1. In the sidebar, click New and select App from the menu.
2. Choose whether to start with a complete, pre-built example app or with your own source code and artifacts:
   - To start with an example app, select Template, click the tab for your preferred framework, and select from the list of apps.
   - To create an app using your own code, select Custom.
3. Click Next.
4. In the App name field, enter a name for the app and optionally enter a description.

   Note

   The name assigned to a Databricks app cannot be changed after the app is created, and any user with access to the Databricks workspace can see the names and deployment history of all Databricks apps in the workspace. Additionally, the app name is included in records written to system tables. Because of this visibility, do not include sensitive information when naming your Databricks apps.

   The name must be unique in the Databricks workspace that hosts the app and must contain only lowercase letters, numbers, and hyphens.
5. If you selected Custom, click Create app. If you selected Template, configure the required resources and click Create and deploy app. See Assign Databricks platform features to a Databricks app.
The app details page appears after you click Create and deploy app or Create app.

If you selected Template, Databricks Apps creates your app and then deploys it, including the example code from the template and the required configuration for the app. The details page for the app shows:

- The status of the app's creation and deployment.
- The steps you can use to continue development of the app locally, including copying the app artifacts to your local environment and syncing local changes back to the Databricks workspace.

If you selected Custom, Databricks Apps creates your app. Because you must add the code and artifacts for your app, you must deploy the app as a separate step. The details page for the app shows:

- The status of the app's creation.
- The steps you can use to sync the app's code and artifacts from your local development environment to your Databricks workspace and deploy the app.
To learn how to set up a local development environment, create or update the code and configuration for your app locally, and sync and deploy the app to your Databricks workspace, see Get started with Databricks Apps.
If you selected Template and want to copy your app artifacts from the workspace to your local development environment, you can use the Databricks CLI:
databricks workspace export-dir <workspace-path> <target-path>
Replace:

- `<workspace-path>` with the path to the workspace files directory that contains your app code and artifacts.
- `<target-path>` with a path in your local environment to copy the files to.
Assign Databricks platform features to a Databricks app
Note
To use Databricks SQL, service principals require access to a SQL warehouse and any tables accessed by queries.
To use features of the Databricks platform such as Databricks SQL, Databricks Jobs, Mosaic AI Model Serving, and Databricks secrets, add these features to your app as resources. You can add resources when you create or edit an app.
1. In the Create new app or Edit app card, click Advanced settings > + Add resource, and select the resource type.
2. Depending on the resource type, complete the fields required to configure the resource, including the Resource key field. This key is used later to reference the resource.
3. Click Save.
4. Add an entry for the resource in the app.yaml configuration file using the SQL warehouse resource key. Because this references the source of the parameter value and not the actual value, use `valueFrom` instead of `value`:

   ```yaml
   env:
     - name: "DATABRICKS_WAREHOUSE_ID"
       valueFrom: "sql-warehouse"
   ```

5. To reference the resource in your app code, use the value of the `name` field (`DATABRICKS_WAREHOUSE_ID` in this example) to refer to the configured key value:

   ```python
   import os

   os.getenv('DATABRICKS_WAREHOUSE_ID')
   ```
To see more examples of using resources with apps, including SQL warehouses and model serving endpoints, see the template examples when you create an app and Best practice: Use secrets to store sensitive information for a Databricks app.
View the details for a Databricks app
To view the details page for a Databricks app:
1. Click Compute in the sidebar.
2. Go to the Apps tab.
3. In the Name column, click the app name.
The Overview tab appears with details for the app, including its status, deployment location, and any associated resources.
To access the app’s deployment history, go to the Deployments tab.
To view the logs for the app, go to the Logs tab.
To view the runtime environment for the app, including environment variables and installed packages, go to the Environment tab.
Configure permissions for your Databricks app
To manage the app's permissions, you must have the `CAN MANAGE` or `IS OWNER` permission.
1. On the app details page, click Permissions.
2. In Permissions Settings, select the Select User, Group or Service Principal… drop-down menu and then select a user, group, service principal, or all workspace users.
3. Select a permission from the permission drop-down menu.
4. Click Add and then click Save.
Maintaining state for your Databricks app
Any state your app maintains in memory is lost when it restarts. If your app requires maintaining state between restarts, store the state externally. For example, your app can use Databricks SQL, workspace files, or Unity Catalog volumes to persist state.
Logging from your Databricks app
Note
To view logs in the Databricks Apps UI or with the app URL, your app must log to `stdout` and `stderr`.
To view the standard output and standard error for an app, on the details page for the app, click the Logs tab. See View the details for a Databricks app.
You can also view the standard output and standard error logs at the `<appurl>/logz` link. For example, if the URL for your app is `https://my-app-1234567890.my-instance.databricksapps.com`, then you can view the logs at `https://my-app-1234567890.my-instance.databricksapps.com/logz`. To find the app URL, go to the app details page.
Specifying library dependencies for your Databricks app
If your app requires Python libraries other than the packages automatically installed with your deployment, use a `requirements.txt` file to define those libraries. If a package in your `requirements.txt` file duplicates one of the automatically installed packages, the version in your `requirements.txt` file overrides the automatically installed package.
For the list of packages and versions installed as part of your app deployment, see Installed Python libraries.
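For example, a hypothetical `requirements.txt` might add one extra library and pin a different version of a preinstalled package (the package names and versions here are illustrative only):

```
# A library not in the default environment
tenacity>=8.2
# Overrides the automatically installed version of this package
gradio==4.44.0
```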
What HTTP headers are passed to Databricks apps?
The following `X-Forwarded-*` headers are passed from the Databricks Apps reverse proxy to apps:

| Header | Description |
|---|---|
| `X-Forwarded-Host` | The original host or domain requested by the client. |
| `X-Forwarded-Preferred-Username` | The user name provided by the IdP. |
| `X-Forwarded-User` | The user identifier provided by the IdP. |
| `X-Forwarded-Email` | The user email provided by the IdP. |
| `X-Forwarded-For` | The IP address of the client that made the original request. |
| `X-Request-Id` | The UUID of the request. |
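For example, an app can read these headers from an incoming request to identify the calling user. The sketch below operates on a plain mapping of header names to values and uses a helper name of my own invention; in a real app you would pass in your framework's request headers (for example, `request.headers` in Flask or FastAPI):

```python
def forwarded_identity(headers: dict) -> dict:
    """Extract the user identity set by the Databricks Apps reverse proxy.

    `headers` is any mapping of HTTP header names to values. The helper
    name is illustrative, not a Databricks API.
    """
    return {
        "username": headers.get("X-Forwarded-Preferred-Username"),
        "user_id": headers.get("X-Forwarded-User"),
        "email": headers.get("X-Forwarded-Email"),
    }

# Example with a hand-built header mapping:
identity = forwarded_identity({
    "X-Forwarded-Preferred-Username": "someone",
    "X-Forwarded-User": "1234567890",
    "X-Forwarded-Email": "someone@example.com",
})
```

Because requests arrive through the reverse proxy, these headers are the supported way to learn who is calling; do not infer identity from the request origin.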
What frameworks are supported by Databricks Apps?
You can use most Python frameworks to develop your apps. To see examples of using specific frameworks, including Dash, Gradio, and Streamlit, select from the library of template apps when creating a new app in the UI. See How do I create an app in the Databricks Apps UI?.
For Streamlit-specific variables that are set in the Databricks Apps runtime environment, see Default environment variables for Streamlit.
Best practice: Use secrets to store sensitive information for a Databricks app
Databricks recommends using secrets to store sensitive information, such as authentication credentials. To learn more about using secrets, see Secrets.
To use a secret with your app:
Configure the secret as an app resource.
Add an entry for the secret in the app's app.yaml configuration file:

```yaml
env:
  - name: "API_TOKEN"
    valueFrom: "api-token-value"
```
To reference the secret in your app code, use the value of the `name` field (`API_TOKEN` in this example) to refer to the configured key value:

```python
import os

token = os.getenv('API_TOKEN')
```
Best practice: Use Databricks features for data processing
Databricks Apps compute is designed for serving UIs. To ensure that your apps can efficiently support multiple users, offload anything beyond simple data processing to other Databricks platform features. For example, use Databricks SQL for query processing and storing datasets, Databricks Jobs for data processing, or model serving to query AI models.
Best practice: Follow secure coding best practices
Databricks recommends following secure coding practices when developing your apps, including parameterizing queries to avoid SQL injection attacks. See the statement execution API.
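The same principle applies regardless of which database client your app uses. As a generic illustration (using Python's built-in `sqlite3` module rather than a Databricks API), bind user input as parameters instead of interpolating it into the SQL string:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, role TEXT)")
conn.executemany("INSERT INTO users VALUES (?, ?)",
                 [("alice", "admin"), ("bob", "viewer")])

# Unsafe: f"SELECT role FROM users WHERE name = '{user_input}'" lets a
# crafted input rewrite the query. Safe: bind the value as a parameter.
user_input = "alice"
rows = conn.execute(
    "SELECT role FROM users WHERE name = ?", (user_input,)
).fetchall()
# rows == [("admin",)]
```

The Databricks Statement Execution API and SQL connectors support parameterized statements in the same spirit: the query text and the user-supplied values travel separately.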
Important guidelines for implementing Databricks apps
- Databricks Apps sends a `SIGKILL` signal 15 seconds after a `SIGTERM`, so apps should gracefully shut down no more than 15 seconds after receiving the `SIGTERM` signal. If an app has not exited after 15 seconds, a `SIGKILL` signal is sent to terminate the process and all child processes.
- Because Databricks apps run as a non-privileged system user, they cannot perform operations that require a privileged security context, such as operations requiring root user permissions.
- Requests are forwarded from a reverse proxy, so apps must not depend on the origins of the requests. The Databricks Apps environment sets the required configuration parameters for supported frameworks.
- Because the Databricks Apps framework manages Transport Layer Security (TLS) connections, your apps must not perform any TLS connection or handshake operations.
- Your apps must be implemented to handle requests in HTTP/2 cleartext (H2C) format.
- Databricks apps must host HTTP servers on `0.0.0.0` and use the port number specified in the `DATABRICKS_APP_PORT` environment variable. See environment variables.