Jobs with a large number of tasks

You can create jobs that contain up to 1000 tasks. The following topics describe how to resolve issues that can come up with large jobs, meaning jobs with more than 100 tasks.
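For example, the following minimal sketch builds a 200-task job with the Databricks SDK for Python. The job name and notebook path are placeholder assumptions, and the tasks assume serverless jobs compute; otherwise, attach compute to each task.

```python
# A minimal sketch, assuming the Databricks SDK for Python (databricks-sdk)
# is installed and workspace authentication is configured.
from databricks.sdk import WorkspaceClient
from databricks.sdk.service import jobs

w = WorkspaceClient()

# Build 200 tasks that each run the same placeholder notebook. This exceeds
# the 100-task limit of older API versions, so it requires an up-to-date SDK
# (see the version table below). These tasks assume serverless jobs compute;
# otherwise, set job_cluster_key or existing_cluster_id on each task.
tasks = [
    jobs.Task(
        task_key=f"task_{i}",
        notebook_task=jobs.NotebookTask(notebook_path="/Workspace/Shared/my_notebook"),
    )
    for i in range(200)
]

job = w.jobs.create(name="job-with-200-tasks", tasks=tasks)
print(f"Created job {job.job_id} with {len(tasks)} tasks")
```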

Get an error that only 100 tasks are allowed

If you receive an error like "Resources with more than 100 tasks can only be handled by API 2.2 and above. Your resource has 200 tasks", you must update your Databricks SDK or Databricks CLI.

The following are the minimum SDK versions required, by language:

| SDK language | Minimum version |
| ------------ | --------------- |
| Go           | 0.60.0          |
| Python       | 0.45.0          |
| Java         | 0.42.0          |

For information on using and upgrading SDKs, see Use SDKs with Databricks.

Your Databricks CLI must be at least version 0.244.0. For information about installing and updating the Databricks CLI, see Install or update the Databricks CLI.
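Before updating, you can confirm which version of the Databricks SDK for Python is installed. The following is a minimal sketch using the standard library plus the `packaging` package (the 0.45.0 minimum comes from the table above):

```python
# Check the installed databricks-sdk version against the documented minimum.
# Assumes the `packaging` package is available (pip install packaging).
from importlib.metadata import version

from packaging.version import Version

MINIMUM = Version("0.45.0")  # minimum Python SDK version from the table above
installed = Version(version("databricks-sdk"))

if installed < MINIMUM:
    print(f"databricks-sdk {installed} is too old for jobs with more than "
          f"100 tasks; upgrade with: pip install --upgrade databricks-sdk")
else:
    print(f"databricks-sdk {installed} meets the {MINIMUM} minimum.")
```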

Get an error that only 150 execution contexts are allowed

If you receive an error like "Too many execution contexts are open right now. (Limit set to 150)", your job is trying to run too many tasks on a single compute cluster. You must split the tasks across multiple clusters, as shown in the sketch below. For more information, see Configure compute for jobs.
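One way to split tasks is to define several job clusters and assign tasks to them round-robin through job_cluster_key. The following is a minimal sketch using the Databricks SDK for Python; the cluster counts, node type, Spark version, and notebook path are placeholder assumptions:

```python
# A minimal sketch, assuming the Databricks SDK for Python; the cluster spec
# and notebook path below are illustrative placeholders.
from databricks.sdk import WorkspaceClient
from databricks.sdk.service import compute, jobs

w = WorkspaceClient()

NUM_CLUSTERS = 4  # spread tasks so no single cluster hits the 150-context limit
NUM_TASKS = 400

# Define several job clusters up front.
job_clusters = [
    jobs.JobCluster(
        job_cluster_key=f"cluster_{c}",
        new_cluster=compute.ClusterSpec(
            spark_version="15.4.x-scala2.12",  # placeholder runtime version
            node_type_id="i3.xlarge",          # placeholder node type
            num_workers=2,
        ),
    )
    for c in range(NUM_CLUSTERS)
]

# Assign tasks to the clusters round-robin so the load is spread evenly.
tasks = [
    jobs.Task(
        task_key=f"task_{i}",
        job_cluster_key=f"cluster_{i % NUM_CLUSTERS}",
        notebook_task=jobs.NotebookTask(notebook_path="/Workspace/Shared/my_notebook"),
    )
    for i in range(NUM_TASKS)
]

job = w.jobs.create(
    name="job-split-across-clusters",
    job_clusters=job_clusters,
    tasks=tasks,
)
print(f"Created job {job.job_id}")
```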

The matrix view for job runs slows down or doesn't show individual tasks

If you view a large number of tasks in the matrix view, for example, a job with 500 tasks across 100 job runs, the UI may slow down. In extreme cases, the view shows only the summary for each job run rather than details for each task.

You can filter the view to a shorter time span so that fewer tasks are shown at once. This can improve responsiveness and allow the view to show details for all tasks.

For more information about the matrix view, see View runs for a single job.