Run Delta Live Tables pipelines
You run Delta Live Tables pipelines by starting a pipeline update. Most commonly, you run a full update to refresh all of the datasets in a pipeline, but Delta Live Tables also supports other update types for different tasks. For example, you can run an update that refreshes only selected tables for testing or debugging.
You can run updates manually in the Delta Live Tables UI or from a Databricks notebook, or trigger them programmatically using the REST API or the Databricks CLI, as sketched below. You can also schedule updates or include them in a workflow using an orchestration tool such as Databricks Jobs or Apache Airflow. The articles in this section detail these options for running your Delta Live Tables pipelines.
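For example, the following is a minimal sketch of triggering an update programmatically through the Pipelines REST API (`POST /api/2.0/pipelines/{pipeline_id}/updates`). The workspace URL, token, and pipeline ID are placeholders you must replace with your own values:

```python
import requests

WORKSPACE_URL = "https://<your-workspace>.cloud.databricks.com"  # placeholder
TOKEN = "<personal-access-token>"  # placeholder; prefer a secret store in practice
PIPELINE_ID = "<pipeline-id>"  # placeholder

# POST /api/2.0/pipelines/{pipeline_id}/updates starts a new pipeline update.
resp = requests.post(
    f"{WORKSPACE_URL}/api/2.0/pipelines/{PIPELINE_ID}/updates",
    headers={"Authorization": f"Bearer {TOKEN}"},
    json={},  # an empty body requests a default update
)
resp.raise_for_status()

# The response identifies the update that was started.
print("Started update:", resp.json()["update_id"])
```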
Run pipelines manually
To learn how the datasets defined in a pipeline are processed when an update runs, which update types are supported, and how to choose settings for updates, see Run an update on a Delta Live Tables pipeline.
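As an illustration of the update types mentioned above, the sketch below starts a full update and then a selective update through the same REST endpoint. The `full_refresh` and `refresh_selection` request fields are based on the Pipelines API but should be verified against your workspace's API reference, and the table names are hypothetical:

```python
import requests

WORKSPACE_URL = "https://<your-workspace>.cloud.databricks.com"  # placeholder
TOKEN = "<personal-access-token>"  # placeholder
PIPELINE_ID = "<pipeline-id>"  # placeholder

def start_update(body: dict) -> str:
    """Start a pipeline update with the given options; return its update ID."""
    resp = requests.post(
        f"{WORKSPACE_URL}/api/2.0/pipelines/{PIPELINE_ID}/updates",
        headers={"Authorization": f"Bearer {TOKEN}"},
        json=body,
    )
    resp.raise_for_status()
    return resp.json()["update_id"]

# Full update: reprocess all datasets in the pipeline from scratch.
start_update({"full_refresh": True})

# Selective update: refresh only the named tables, e.g. while debugging.
# The table names here are hypothetical examples.
start_update({"refresh_selection": ["sales_orders_raw", "sales_orders_cleaned"]})
```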
Run pipelines using orchestration tools
To learn how to run a pipeline on a schedule or as part of a larger data processing workflow, see Run a Delta Live Tables pipeline in a workflow.
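As one example of orchestration, the following is a minimal sketch of a daily Apache Airflow DAG that starts a pipeline update. It assumes the apache-airflow-providers-databricks package, a configured Databricks connection named `databricks_default`, and Airflow 2.4 or later (earlier versions use `schedule_interval` instead of `schedule`); the pipeline ID is a placeholder:

```python
from datetime import datetime

from airflow import DAG
from airflow.providers.databricks.operators.databricks import (
    DatabricksSubmitRunOperator,
)

with DAG(
    dag_id="dlt_pipeline_daily",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",  # trigger one update per day
    catchup=False,
) as dag:
    # Submit a one-time run whose task starts a Delta Live Tables
    # pipeline update via the Jobs API's pipeline_task payload.
    run_pipeline = DatabricksSubmitRunOperator(
        task_id="start_pipeline_update",
        databricks_conn_id="databricks_default",
        json={"pipeline_task": {"pipeline_id": "<pipeline-id>"}},  # placeholder ID
    )
```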