Develop Databricks Apps

Preview

Databricks Apps is in Public Preview.

Note

To deploy and run apps in your Databricks workspace, you must ensure that your firewall does not block the domain *.databricksapps.com.

This article describes how to create data and AI apps, including how to create and edit apps in the UI, how to use Databricks platform features such as SQL warehouses, secrets, and Databricks Jobs, best practices for developing your apps, and important information for developing apps with supported frameworks.

How do I create an app in the Databricks Apps UI?

Note

  • The name assigned to a Databricks app cannot be changed after creating the app, and any user with access to a Databricks workspace can see the names and deployment history of all Databricks apps in the workspace. Additionally, the app name is included in records written to system tables. Because of this visibility, you should not use sensitive information when naming your Databricks apps.

  • Because the name of a Databricks app is used to construct the link to the deployed app, the name must use only characters that are valid in a URL.

  1. In the sidebar, click New and select App from the menu.

  2. You can start with a complete, pre-built example app or use your source code and artifacts.

    • To start with an example app, select Template, click the tab for your preferred framework, and select from the list of apps.

    • To create an app using your code, select Custom.

  3. Click Next.

  4. In the App name field, enter a name for the app.

  5. (Optional) Enter a description for the app.

  6. If you selected Template, configure required resources and click Create and deploy app. See Assign Databricks platform features to a Databricks app. If you selected Custom, click Create app.

The app details page appears after you create the app and shows the app’s creation and deployment status. The details page also includes steps for developing the app locally, including syncing local code back to the Databricks workspace.

To copy your app artifacts from the workspace to your local development environment, you can use the Databricks CLI:

databricks workspace export-dir <workspace-path> <target-path>

Replace:

  • <workspace-path> with the path to the workspace files directory that contains your app code and artifacts.

  • <target-path> with a path in your local environment to copy the files to.

Assign Databricks platform features to a Databricks app

Note

To use Databricks SQL, service principals require access to a SQL warehouse and any tables accessed by queries.

To use features of the Databricks platform such as Databricks SQL, Databricks Jobs, Mosaic AI Model Serving, and Databricks secrets, add these features to your app as resources. You can add resources when you create or edit an app.

  1. In the Create new app or Edit app card, click Advanced settings > + Add resource, and select the resource type.

    Adding a SQL warehouse as an App resource in the UI
  2. Depending on the resource type, complete the fields required to configure the resource, including the Resource key field. This key is used later to reference the resource.

  3. Click Save.

  4. Add an entry for the resource in the app’s app.yaml configuration file. For the SQL warehouse resource key in this example, use an entry like the following. Because the entry references the source of the parameter value rather than the value itself, use valueFrom instead of value.

    env:
      - name: "DATABRICKS_WAREHOUSE_ID"
        valueFrom: "sql-warehouse"
    
  5. To reference the resource in your app code, use the value of the name field (DATABRICKS_WAREHOUSE_ID in this example) to refer to the configured key value.

    import os
    
    os.getenv('DATABRICKS_WAREHOUSE_ID')
    

To see more examples of using resources with apps, including SQL warehouses and model serving endpoints, see the template examples when you create an app and Best practice: Use secrets to store sensitive information for a Databricks app.
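A complete app.yaml typically also declares the command that starts the app. The following is a minimal sketch; the streamlit command and the app.py filename are illustrative and depend on your framework and entrypoint:

```yaml
# Illustrative app.yaml: the start command and file name are examples only.
command: ["streamlit", "run", "app.py"]
env:
  - name: "DATABRICKS_WAREHOUSE_ID"
    valueFrom: "sql-warehouse"
```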

Configure permissions for your Databricks app

To manage the app’s permissions, you must have the CAN MANAGE or IS OWNER permission.

  1. On the app details page, click Permissions.

  2. In Permissions Settings, select the Select User, Group or Service Principal… drop-down menu and then select a user, group, service principal, or all workspace users.

  3. Select a permission from the permission drop-down menu.

  4. Click Add.

  5. Click Save.

Maintaining state for your Databricks app

Any state your app maintains in memory is lost when the app restarts. If your app must maintain state between restarts, store the state externally. For example, your app can use Databricks SQL, workspace files, or Unity Catalog volumes to persist state.
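As a minimal sketch, the pattern of externalizing state can look like the following. The file path here is hypothetical and local disk in a deployed app is itself ephemeral, so a real app would point this at external storage such as a Unity Catalog volume:

```python
import json
from pathlib import Path

# Hypothetical state location; local disk in an app is ephemeral, so a real
# app would persist to external storage (for example, a Unity Catalog volume).
STATE_PATH = Path("/tmp/app_state.json")

def save_state(state: dict) -> None:
    # Serialize the app state so it survives a restart.
    STATE_PATH.write_text(json.dumps(state))

def load_state() -> dict:
    # Return the last saved state, or an empty dict on first start.
    if STATE_PATH.exists():
        return json.loads(STATE_PATH.read_text())
    return {}
```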

Logging from your Databricks app

Note

To view logs in the Databricks Apps UI or with the app URL, your app must log to stdout and stderr.

To view the standard output and standard error for an app, on the details page for the app, click the Logs tab.

You can also view the standard output and standard error logs at the <appurl>/logz link. For example, if the URL for your app is https://my-app-1234567890.my-instance.databricksapps.com, then you can view the logs at https://my-app-1234567890.my-instance.databricksapps.com/logz.

You can find the app URL on the app details page.
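Because only stdout and stderr are captured, configure your Python logger with a stream handler rather than a file handler. A sketch (the logger name is arbitrary):

```python
import logging
import sys

def configure_logging() -> logging.Logger:
    # Databricks Apps surfaces only stdout/stderr, so attach a stream
    # handler on stdout instead of logging to a file.
    logger = logging.getLogger("my_app")  # arbitrary logger name
    handler = logging.StreamHandler(sys.stdout)
    handler.setFormatter(logging.Formatter("%(asctime)s %(levelname)s %(message)s"))
    logger.addHandler(handler)
    logger.setLevel(logging.INFO)
    return logger
```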

Specifying library dependencies for your Databricks app

If your app requires Python libraries beyond the packages automatically installed with your deployment, use a requirements.txt file to define them. If a package in your requirements.txt file duplicates an automatically installed package, the version in your requirements.txt overrides the automatically installed one.

For the list of packages and versions installed as part of your app deployment, see Installed Python libraries.
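For example, a requirements.txt can add a new library and pin a different version of a preinstalled one. The package names and versions below are illustrative only:

```
gradio==4.44.0
# A pinned version here overrides the automatically installed package.
pandas==2.2.2
```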

What HTTP headers are passed to Databricks apps?

The following X-Forwarded-* headers are passed from the Databricks Apps reverse proxy to apps:

  • X-Forwarded-Host: The original host or domain requested by the client.

  • X-Forwarded-Preferred-Username: The username provided by the IdP.

  • X-Forwarded-User: The user identifier provided by the IdP.

  • X-Forwarded-Email: The user email provided by the IdP.

  • X-Real-Ip: The IP address of the client that made the original request.

  • X-Request-Id: The UUID of the request.
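Your web framework exposes these headers through its request object. As a framework-neutral sketch, a WSGI app receives them in the environ mapping, where header names are upper-cased and prefixed with HTTP_:

```python
def app(environ, start_response):
    # WSGI upper-cases headers and prefixes them with HTTP_, so
    # X-Forwarded-Email arrives as HTTP_X_FORWARDED_EMAIL.
    email = environ.get("HTTP_X_FORWARDED_EMAIL", "unknown")
    start_response("200 OK", [("Content-Type", "text/plain")])
    return [f"Hello, {email}".encode()]
```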

What frameworks are supported by Databricks Apps?

You can use most Python frameworks to develop your apps. To see examples of using specific frameworks, including Dash, Gradio, and Streamlit, select from the library of template apps when creating a new app in the UI. See How do I create an app in the Databricks Apps UI?.

For Streamlit-specific variables that are set in the Databricks Apps runtime environment, see Default environment variables for Streamlit.

Best practice: Use secrets to store sensitive information for a Databricks app

Databricks recommends using secrets to store sensitive information such as authentication credentials. To learn more about using secrets, see Secrets.

To use a secret with your app:

  1. Configure the secret as an app resource.

    Adding a secret as an App resource in the UI
  2. Add an entry for the secret in the app’s app.yaml configuration file.

    env:
      - name: "API_TOKEN"
        valueFrom: "api-token-value"
    
  3. To reference the secret in your app code, use the value of the name field (API_TOKEN in this example) to refer to the configured key value.

    import os

    token = os.getenv('API_TOKEN')

Best practice: Use Databricks features for data processing

Databricks Apps compute is designed for serving UIs. To ensure your apps can efficiently support multiple users, offload anything beyond simple data processing to Databricks platform features. For example, use Databricks SQL for query processing and storing datasets, Databricks Jobs for data processing, or model serving to query AI models.

Best practice: Follow secure coding best practices

Databricks recommends following secure coding practices when developing your apps, including parameterizing queries to avoid SQL injection attacks. See the Statement Execution API.
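The parameterization principle is shown below with Python's built-in sqlite3 module; the same bind-parameter approach applies when querying Databricks SQL, where building queries with string formatting invites injection:

```python
import sqlite3

# Parameterized queries with the stdlib sqlite3 module, for illustration only.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT)")
conn.execute("INSERT INTO users VALUES (?)", ("alice",))

# Hostile input bound as a parameter is treated as a literal value, not as SQL.
user_input = "alice' OR '1'='1"
rows = conn.execute("SELECT name FROM users WHERE name = ?", (user_input,)).fetchall()
```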

Important guidelines for implementing Databricks apps

  • Databricks Apps sends a SIGTERM signal to an app before terminating it, so apps must shut down gracefully within 15 seconds of receiving SIGTERM. If an app has not exited after 15 seconds, a SIGKILL signal terminates the process and all of its child processes.

  • Because Databricks apps run as a non-privileged system user, they cannot perform operations that require a privileged security context, such as operations requiring root user permissions.

  • Requests are forwarded from a reverse proxy, so apps must not depend on the requests’ origins. The Databricks Apps environment sets the required configuration parameters for supported frameworks.

  • Because the Databricks app framework manages Transport Layer Security (TLS) connections, your apps must not perform any TLS connection or handshake operations.

  • Your apps must be implemented to handle requests in HTTP/2 cleartext (H2C) format.

  • Databricks apps must host HTTP servers on 0.0.0.0 and use the port number specified in the DATABRICKS_APP_PORT environment variable. See environment variables.
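The shutdown and server-binding guidelines can be sketched with Python's standard library. This is illustrative only; the fallback port 0, which asks the OS for a free port, is for local testing, because the Apps runtime always sets DATABRICKS_APP_PORT:

```python
import os
import signal
import threading
from http.server import HTTPServer, BaseHTTPRequestHandler

class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Minimal response handler for illustration.
        self.send_response(200)
        self.end_headers()
        self.wfile.write(b"ok")

# The Apps runtime sets DATABRICKS_APP_PORT; 0 lets the OS pick a port locally.
port = int(os.getenv("DATABRICKS_APP_PORT", "0"))
# Bind to 0.0.0.0 as Databricks Apps requires.
server = HTTPServer(("0.0.0.0", port), Handler)

def handle_sigterm(signum, frame):
    # SIGKILL follows 15 seconds after SIGTERM, so stop accepting requests now.
    # shutdown() must run on a separate thread, or it deadlocks serve_forever().
    threading.Thread(target=server.shutdown, daemon=True).start()

signal.signal(signal.SIGTERM, handle_sigterm)
# server.serve_forever()  # blocking call; uncomment in a real app
```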