Dashboard developer workflows
AI/BI dashboards support programmatic and DevOps-oriented workflows for managing dashboards at scale. You can manage dashboards as code using Declarative Automation Bundles and REST APIs, transfer dashboards across workspaces using import and export, and apply source control using Databricks Git folders.
| Capability | Description |
|---|---|
| Manage as code | Manage dashboards as code using Declarative Automation Bundles or Terraform. Automate creation, updates, and sharing using REST APIs. Schedule routine dashboard refreshes with Lakeflow Jobs. |
| Import and export | Export dashboards as portable files that you can import into other workspaces. |
| Source control | Version-control dashboard files using Databricks Git folders. Implement CI/CD workflows to develop dashboards in branches and deploy them across environments. |
Manage dashboards with Declarative Automation Bundles
To learn how to manage an AI/BI dashboard using Declarative Automation Bundles, see dashboard. For an example bundle that defines a dashboard, see the bundle-examples GitHub repository.
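As a sketch of what such a bundle can look like, the configuration below registers a single dashboard resource. The bundle name, resource key, file name, and warehouse variable are placeholders; check the dashboard resource reference and the bundle-examples repository for the exact schema.

```yaml
# databricks.yml — minimal sketch; names and IDs are placeholders
bundle:
  name: sales-dashboards

resources:
  dashboards:
    sales_overview:                                   # arbitrary resource key
      display_name: "Sales Overview"
      file_path: ./src/sales_overview.lvdash.json     # exported dashboard definition
      warehouse_id: ${var.warehouse_id}

variables:
  warehouse_id:
    description: SQL warehouse used to run the dashboard's queries
```

Deploying the bundle (for example, with `databricks bundle deploy`) creates or updates the dashboard in the target workspace from the checked-in `.lvdash.json` file.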
Databricks also offers a Terraform provider. See the Databricks Terraform documentation.
Manage dashboards with REST APIs
See Use Databricks APIs to manage dashboards for tutorials that demonstrate how to use Databricks REST APIs to manage dashboards. The included tutorials explain how to convert legacy dashboards into Lakeview dashboards, and how to create, manage, and share them.
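As a minimal sketch using only the Python standard library, the snippet below builds a request body for creating a draft dashboard and posts it to the workspace. The endpoint path and payload fields follow the Lakeview REST API (`POST /api/2.0/lakeview/dashboards`); the host, token, warehouse ID, and dashboard contents are placeholders to adapt to your workspace.

```python
import json
import os
import urllib.request

# Lakeview dashboards endpoint (see the REST API reference for details)
LAKEVIEW_CREATE_PATH = "/api/2.0/lakeview/dashboards"


def build_create_payload(display_name, warehouse_id, pages=None):
    """Build the JSON body for creating a draft dashboard.

    `serialized_dashboard` carries the dashboard definition as a JSON string.
    """
    definition = {"pages": pages or []}
    return {
        "display_name": display_name,
        "warehouse_id": warehouse_id,
        "serialized_dashboard": json.dumps(definition),
    }


def create_dashboard(host, token, payload):
    """POST the payload to the workspace; needs a token with dashboard permissions."""
    req = urllib.request.Request(
        host.rstrip("/") + LAKEVIEW_CREATE_PATH,
        data=json.dumps(payload).encode(),
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)


payload = build_create_payload("Sales Overview", "abc123warehouse")
print(payload["display_name"])

# Only attempt the network call when credentials are configured.
if os.environ.get("DATABRICKS_HOST") and os.environ.get("DATABRICKS_TOKEN"):
    created = create_dashboard(
        os.environ["DATABRICKS_HOST"], os.environ["DATABRICKS_TOKEN"], payload
    )
    print(created.get("dashboard_id"))
```

The response to a successful create includes a `dashboard_id`, which later calls (update, publish, share) reference.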
Schedule updates using Lakeflow Jobs
You can configure a task to routinely update an existing published dashboard. To learn more about orchestrating workflows with Lakeflow Jobs, see Lakeflow Jobs. To learn how to configure a dashboard task, see Dashboard task for jobs.
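As an illustrative sketch, a job that refreshes a published dashboard on a cron schedule can be described with a Jobs API body like the one below. The `dashboard_task` field follows the Jobs API dashboard task, and the job name, dashboard ID, and cron expression are placeholder assumptions; verify the field names against the Jobs API reference.

```python
import json


def build_dashboard_job(name, dashboard_id, cron, timezone="UTC"):
    """Build a Jobs API body with a single dashboard-refresh task.

    `dashboard_id` identifies an existing published dashboard; the job
    runs on the given Quartz cron expression.
    """
    return {
        "name": name,
        "tasks": [
            {
                "task_key": "refresh_dashboard",
                "dashboard_task": {"dashboard_id": dashboard_id},
            }
        ],
        "schedule": {
            "quartz_cron_expression": cron,  # e.g. every day at 06:00
            "timezone_id": timezone,
        },
    }


# Placeholder IDs for illustration only.
job = build_dashboard_job("daily-sales-refresh", "01ef-placeholder", "0 0 6 * * ?")
print(json.dumps(job, indent=2))
```

Submitting this body to the jobs create endpoint registers the schedule; the dashboard then refreshes on the job's cadence rather than through a schedule created in the dashboard UI.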
Schedules and subscriber lists that you create using the dashboard UI or API are distinct from the scheduling and automation associated with a job. See Automating jobs with schedules and triggers.