View and manage job runs
This article describes the features available in the Databricks UI to view jobs you have access to, view a history of runs for a job, and view job run details. To learn about using the Databricks CLI to view and run jobs, see Jobs CLI. To learn about using the Jobs API, see the Jobs API.
View jobs
Click Workflows in the sidebar. The Jobs list appears, showing all defined jobs, the cluster definition, the schedule (if any), and the result of the last run.
Note
If you have the increased jobs limit enabled for this workspace, only 25 jobs are displayed in the Jobs list to improve the page loading time. Use the left and right arrows to page through the full list of jobs.
You can filter jobs in the Jobs list:
Using keywords. If you have the increased jobs limit feature enabled for this workspace, searching by keywords is supported only for the name, job ID, and job tag fields.
Selecting only the jobs you own.
Selecting all jobs you have permissions to access. Access to this filter requires that Jobs access control is enabled.
Using tags. To search for a tag created with only a key, type the key into the search box. To search for a tag created with a key and value, you can search by the key, the value, or both. For example, for a tag with the key department and the value finance, you can search for department or finance to find matching jobs. To search by both the key and value, enter the key and value separated by a colon; for example, department:finance.
You can also click any column header to sort the list of jobs (descending or ascending) by that column. When the increased jobs limit feature is enabled, you can sort only by Name, Job ID, or Created by. The default sort is by Name in ascending order.
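The jobs list, including its 25-per-page paging, can also be retrieved programmatically. Below is a minimal sketch, assuming the Jobs API 2.1 GET /api/2.1/jobs/list endpoint with limit/offset paging and a has_more flag in the response; fake_fetch is a stand-in for an authenticated HTTP call against your workspace, and the job data is invented for illustration:

```python
# Sketch: paging through the jobs list, mirroring the UI's 25-per-page
# behavior. `fetch_page` stands in for an authenticated GET against
# /api/2.1/jobs/list (e.g. via the `requests` library).

def list_all_jobs(fetch_page, page_size=25):
    """Collect every job by following offset-based pages."""
    jobs, offset = [], 0
    while True:
        page = fetch_page(limit=page_size, offset=offset)
        jobs.extend(page.get("jobs", []))
        if not page.get("has_more"):
            return jobs
        offset += page_size

# Stubbed responses standing in for the real endpoint.
def fake_fetch(limit, offset):
    all_jobs = [{"job_id": i, "settings": {"name": f"job-{i}"}} for i in range(60)]
    return {
        "jobs": all_jobs[offset:offset + limit],
        "has_more": offset + limit < len(all_jobs),
    }

print(len(list_all_jobs(fake_fetch)))  # 60
```

A real caller would pass workspace credentials (for example, a bearer token header) into the fetch function; only the paging loop is shown here.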
View runs for a job
You can view a list of currently running and recently completed runs for a job, including runs started by external orchestration tools such as Apache Airflow or Azure Data Factory. To view the runs for a job:
Click Workflows in the sidebar.
In the Name column, click a job name. The Runs tab appears with matrix and list views of active runs and completed runs.
The matrix view shows a history of runs for the job, including each job task.
The Run total duration row of the matrix displays the total duration of the run and the state of the run. To view details of the run, including the start time, duration, and status, hover over the bar in the Run total duration row.
Each cell in the Tasks row represents a task and the corresponding status of the task. To view details of each task, including the start time, duration, cluster, and status, hover over the cell for that task.
The job run and task run bars are color-coded to indicate the status of the run. Successful runs are green, unsuccessful runs are red, and skipped runs are pink. The height of the individual job run and task run bars provides a visual indication of the run duration.
The runs list view displays:
The start time for the run.
The run identifier.
Whether the run was triggered by a job schedule or an API request, or was manually started.
The time elapsed for a currently running job, or the total running time for a completed run.
Links to the Spark logs.
The status of the run, either Pending, Running, Skipped, Succeeded, Failed, Terminating, Terminated, Internal Error, Timed Out, Canceled, Canceling, or Waiting for Retry.
To change the columns displayed in the runs list view, click Columns and select or deselect columns.
To view details for a job run, click the link for the run in the Start time column in the runs list view. To view details for the most recent successful run of this job, click Go to the latest successful run.
Databricks maintains a history of your job runs for up to 60 days. If you need to preserve job runs, Databricks recommends that you export results before they expire. For more information, see Export job run results.
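The run states shown in the UI correspond to fields returned by the Jobs API. Below is a sketch of polling a run until it finishes, assuming the GET /api/2.1/jobs/runs/get response carries a state object with a life_cycle_state and, for finished runs, a result_state; get_run is a stand-in for the authenticated HTTP call:

```python
# Sketch: waiting for a run to reach a terminal state. Life-cycle
# states such as PENDING and RUNNING are transient; TERMINATED,
# SKIPPED, and INTERNAL_ERROR are terminal. A real poller would sleep
# between calls (e.g. time.sleep(30)); that is omitted so the stub
# below runs instantly.

TERMINAL = {"TERMINATED", "SKIPPED", "INTERNAL_ERROR"}

def wait_for_run(get_run, run_id):
    """Return the final result_state, falling back to the terminal
    life_cycle_state for runs that never produce one (e.g. SKIPPED)."""
    while True:
        state = get_run(run_id)["state"]
        if state["life_cycle_state"] in TERMINAL:
            return state.get("result_state", state["life_cycle_state"])

# Stub that finishes on the third poll.
responses = iter([
    {"state": {"life_cycle_state": "PENDING"}},
    {"state": {"life_cycle_state": "RUNNING"}},
    {"state": {"life_cycle_state": "TERMINATED", "result_state": "SUCCESS"}},
])
print(wait_for_run(lambda run_id: next(responses), 42))  # SUCCESS
```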
View job run details
The job run details page contains job output and links to logs, including information about the success or failure of each task in the job run. You can access job run details from the Runs tab for the job. To view job run details from the Runs tab, click the link for the run in the Start time column in the runs list view. To return to the Runs tab for the job, click the Job ID value.
If the job contains multiple tasks, click a task to view task run details, including:
the cluster that ran the task
the Spark UI for the task
logs for the task
metrics for the task
View task run history
To view the run history of a task, including successful and unsuccessful runs:
Click a task on the Job run details page. The Task run details page appears.
Select the task run in the run history dropdown menu.
View recent job runs
You can view a list of currently running and recently completed runs for all jobs in a workspace that you have access to, including runs started by external orchestration tools such as Apache Airflow or Azure Data Factory. To view the list of recent job runs:
Click Workflows in the sidebar. The Jobs list appears.
Click the Job runs tab to display the Job runs list.
The Job runs list displays:
The start time for the run.
The name of the job associated with the run.
The user name that the job runs as.
Whether the run was triggered by a job schedule or an API request, or was manually started.
The time elapsed for a currently running job, or the total running time for a completed run.
The status of the run, either Pending, Running, Skipped, Succeeded, Failed, Terminating, Terminated, Internal Error, Timed Out, Canceled, Canceling, or Waiting for Retry.
Any parameters for the run.
To view job run details, click the link in the Start time column for the run. To view job details, click the job name in the Job column.
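The same recent-runs view can be fetched programmatically. Below is a sketch, assuming the Jobs API 2.1 GET /api/2.1/jobs/runs/list endpoint and its active_only filter; fetch_runs stands in for the authenticated HTTP call, and the run names are invented for illustration:

```python
# Sketch: listing currently running runs across jobs, newest first.
# `fetch_runs` stands in for an authenticated GET against
# /api/2.1/jobs/runs/list; start_time is epoch milliseconds.

def active_run_names(fetch_runs):
    """Return the names of currently running runs, newest first."""
    resp = fetch_runs(active_only=True)
    runs = sorted(resp.get("runs", []),
                  key=lambda r: r["start_time"], reverse=True)
    return [r["run_name"] for r in runs]

# Stub standing in for the real endpoint.
def fake_fetch_runs(active_only=False):
    return {"runs": [
        {"run_name": "nightly-etl", "start_time": 1700000300000},
        {"run_name": "hourly-sync", "start_time": 1700000400000},
    ]}

print(active_run_names(fake_fetch_runs))  # ['hourly-sync', 'nightly-etl']
```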
View lineage information for a job
If Unity Catalog is enabled in your workspace, you can view lineage information for any Unity Catalog tables in your workflow. If lineage information is available for your workflow, you will see a link with a count of upstream and downstream tables in the Job details panel for your job, the Job run details panel for a job run, or the Task run details panel for a task run. Click the link to show the list of tables. Click a table to see detailed information in Data Explorer.
Export job run results
You can export notebook run results and job run logs for all job types.
Export notebook run results
You can persist job runs by exporting their results. For notebook job runs, you can export a rendered notebook that can later be imported into your Databricks workspace.
To export notebook run results for a job with a single task:
On the job detail page, click the View Details link for the run in the Run column of the Completed Runs (past 60 days) table.
Click Export to HTML.
To export notebook run results for a job with multiple tasks:
On the job detail page, click the View Details link for the run in the Run column of the Completed Runs (past 60 days) table.
Click the notebook task to export.
Click Export to HTML.
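Exporting notebook run results can also be scripted; a sketch follows, assuming the Jobs API GET /jobs/runs/export endpoint, whose response contains one HTML view per exported notebook (the exact response shape here is an assumption; fake_export stands in for the authenticated call, and the notebook name is invented):

```python
# Sketch: saving each exported notebook view of a completed run as an
# .html file. `export_run` stands in for the authenticated HTTP call;
# `write` is any callable taking (filename, content).

def save_notebook_views(export_run, run_id, write):
    """Write each NOTEBOOK view of the run to an .html file and
    return the list of filenames written."""
    saved = []
    for view in export_run(run_id).get("views", []):
        if view.get("type") == "NOTEBOOK":
            name = f"{view['name']}.html"
            write(name, view["content"])
            saved.append(name)
    return saved

# Stub response and an in-memory "filesystem".
files = {}
fake_export = lambda run_id: {
    "views": [{"name": "etl_notebook", "type": "NOTEBOOK",
               "content": "<html>rendered run</html>"}]
}
saved = save_notebook_views(fake_export, 42, files.__setitem__)
print(saved)  # ['etl_notebook.html']
```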
Export job run logs
You can also export the logs for your job run. You can set up your job to automatically deliver logs to DBFS or S3 through the Jobs API. See the new_cluster.cluster_log_conf object in the request body passed to the Create a new job operation (POST /jobs/create) in the Jobs API.
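A sketch of such a request body follows, with an illustrative DBFS log destination; the job name, Spark version, node type, and path below are placeholders, not recommendations, and an S3 variant would use an "s3" block with "destination" and "region" keys instead of "dbfs":

```python
# Sketch: the new_cluster.cluster_log_conf fragment of a
# POST /jobs/create request body, delivering driver and executor
# logs to a DBFS path. All values are illustrative.

job_spec = {
    "name": "nightly-etl",                     # placeholder job name
    "new_cluster": {
        "spark_version": "13.3.x-scala2.12",   # placeholder version
        "node_type_id": "i3.xlarge",           # placeholder node type
        "num_workers": 2,
        "cluster_log_conf": {
            "dbfs": {"destination": "dbfs:/cluster-logs/nightly-etl"}
        },
    },
}
```

This dictionary would be serialized to JSON and posted to the Create a new job operation; only the log-delivery fragment is the point here.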