Disabled tasks in Lakeflow Jobs
A disabled task in a Databricks Lakeflow job is skipped at run time without being removed from the job. Disabled tasks keep their configuration and run history, so you can re-enable them later without rebuilding the task. This page explains how disabled tasks behave when a job runs, including their effect on downstream tasks, repairs, and partial runs.
Downstream task behavior
When a job runs, Lakeflow Jobs evaluates each downstream task's Run if condition against its upstream tasks to decide whether to run, skip, or disable the task. Disabled tasks complete with a Disabled termination code.
If a downstream task's Run if condition can't be met because one or more parent tasks are disabled, Lakeflow Jobs also marks the downstream task as disabled for that run. Disabled downstream tasks show an indicator in the upper-right corner of the Directed Acyclic Graph (DAG) view, so you can see the impact before starting a run.
The following table summarizes downstream behavior for each Run if condition when an upstream task is disabled. For the full list of Run if options, see Configure task dependencies.
| Run if condition | Downstream task behavior when a parent task is disabled |
|---|---|
| All succeeded (default) | The downstream task does not run. A disabled parent task does not satisfy the condition. |
| At least one succeeded | The downstream task runs if at least one other parent task succeeded. If all other parent tasks failed or were disabled, the downstream task does not run. |
| None failed | The downstream task runs if at least one parent task completed without failure. If all parent tasks are disabled, the downstream task does not run. |
| All done | The downstream task runs normally. A disabled parent task is treated as done. |
| At least one failed | The downstream task runs if at least one other parent task failed. A disabled parent task is not treated as a failure. If no other parent task failed, the downstream task does not run. |
| All failed | The downstream task does not run. A disabled parent task is not treated as a failure. |
Only tasks that you explicitly disable have disabled: true in the job definition. Lakeflow Jobs determines downstream disablement at run creation time and doesn't persist it in the job settings.
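In the Jobs API, the Run if condition corresponds to the task-level run_if field. The following is a minimal sketch of a downstream task that still runs when a parent task is disabled; the task keys and notebook path are hypothetical examples:

```json
{
  "task_key": "publish_report",
  "depends_on": [{ "task_key": "load_raw_data" }],
  "run_if": "ALL_DONE",
  "notebook_task": {
    "notebook_path": "/Shared/etl/publish_report"
  }
}
```

Because run_if is ALL_DONE, the disabled parent counts as done and publish_report runs normally, matching the All done row in the table above.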
Disable a task
To disable or re-enable a task using the UI, see Disable a task.
To disable a task through the API or a bundle:
Set disabled: true on the task in the job settings using the Jobs REST API, the Databricks CLI, the Databricks SDK, or Declarative Automation Bundles:
```json
{
  "tasks": [
    {
      "task_key": "load_raw_data",
      "disabled": true,
      "notebook_task": {
        "notebook_path": "/Shared/etl/load_raw_data"
      }
    }
  ]
}
```
The jobs/get and jobs/list responses return disabled: true only for tasks that you explicitly disabled. Tasks disabled dynamically during a run aren't reflected in the stored job settings.
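Because only user-disabled tasks carry disabled: true, you can audit a job by scanning its stored settings. The following is a minimal sketch that inspects a jobs/get-style settings dictionary locally; the payload here is a hypothetical example, not a real API response:

```python
def disabled_task_keys(settings: dict) -> list[str]:
    """Return the task_key of every task marked disabled: true in job settings."""
    return [
        task["task_key"]
        for task in settings.get("tasks", [])
        if task.get("disabled", False)
    ]

# Hypothetical settings payload shaped like a jobs/get response.
settings = {
    "tasks": [
        {"task_key": "load_raw_data", "disabled": True},
        {"task_key": "transform_data"},
    ]
}

print(disabled_task_keys(settings))  # ['load_raw_data']
```

Tasks that Lakeflow Jobs disables dynamically at run creation time never appear in this list, because that state isn't persisted in the job settings.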
Disabled tasks in repairs and partial runs
Disabled tasks behave differently in repair runs and partial runs than in a typical scheduled run:
- Repairs: Lakeflow Jobs uses the run state of each task to determine what to repair, not the task's disabled state. To force a disabled task to run as part of a repair, include it in rerun_tasks in the repair request. See Re-run failed and skipped tasks.
- Partial runs: Disabled tasks aren't selected by default when you start a partial run, but you can select them to run once without re-enabling them in the job settings. Lakeflow Jobs runs exactly the tasks you select and doesn't apply Run if propagation during a partial run.
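To force a disabled task to run during a repair, name it in rerun_tasks. The following is a minimal sketch of a repair request body; the run_id value is a placeholder, and the task key reuses the earlier example:

```json
{
  "run_id": 123456,
  "rerun_tasks": ["load_raw_data"]
}
```

Listing the task in rerun_tasks overrides its disabled state for that repair only; the stored job settings are unchanged.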
Limitations
Disabled tasks have the following limitations:
- An If/else condition task fails if the upstream task that provides its condition value is disabled.
- A For each task fails if the upstream task that provides its input values is disabled.
- Only user-disabled tasks appear as disabled: true in the job definition. To see which downstream tasks are affected before running the job, use the DAG view in the Jobs UI.