Problem: Maximum Execution Context or Notebook Attachment Limit Reached

Problem

Notebook or job execution stops and returns either of the following errors:

Run result unavailable: job failed with error message
Context ExecutionContextId(1731742567765160237) is disconnected.

Can’t attach this notebook because the cluster has reached the attached notebook limit. Detach a notebook and retry.

Cause

When you attach a notebook to a cluster, Databricks creates an execution context. If too many notebooks are attached to the cluster, or too many jobs run on it, the cluster eventually reaches its limit of 145 execution contexts and Databricks returns an error.
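To make this concrete, the minimal sketch below creates and then destroys an execution context through the Command Execution API 1.2; each context created this way counts toward the same per-cluster limit as a notebook attach, and destroying it frees a slot just like detaching. The workspace URL, token, and cluster ID environment variables are placeholders to adjust for your workspace.

```python
# Minimal sketch: each execution context on a cluster counts toward the
# 145-context limit. Assumes the Command Execution API 1.2 endpoints
# (/api/1.2/contexts/create and /api/1.2/contexts/destroy) and that
# DATABRICKS_HOST, DATABRICKS_TOKEN, and CLUSTER_ID are set in the environment.
import os
import requests

host = os.environ["DATABRICKS_HOST"]  # e.g. https://<workspace>.cloud.databricks.com
headers = {"Authorization": f"Bearer {os.environ['DATABRICKS_TOKEN']}"}
cluster_id = os.environ["CLUSTER_ID"]

# Creating a context consumes one slot against the cluster's limit,
# just as attaching a notebook does.
resp = requests.post(
    f"{host}/api/1.2/contexts/create",
    headers=headers,
    json={"clusterId": cluster_id, "language": "python"},
)
context_id = resp.json()["id"]
print(f"Created execution context {context_id}")

# Destroying the context (the API equivalent of detaching a notebook)
# frees the slot again.
requests.post(
    f"{host}/api/1.2/contexts/destroy",
    headers=headers,
    json={"clusterId": cluster_id, "contextId": context_id},
)
```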

Solution

Configure context auto-eviction, which allows Databricks to remove (evict) idle execution contexts; a configuration sketch follows the list below. Additionally, from a pipeline and ETL design perspective, you can avoid this issue by using:

  • Fewer notebooks to reduce the number of execution contexts that are created.
  • A job cluster instead of an interactive cluster. If the use case permits, submit notebooks or JARs as jobs (see the job-submission sketch after this list).
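As a sketch of the auto-eviction setting, the example below sets the idle-context-tracking Spark property on an existing cluster through the Clusters API 2.0. The property name spark.databricks.chauffeur.enableIdleContextTracking is the documented switch for context auto-eviction (enabled by default); the remaining cluster fields are placeholder values, and since clusters/edit replaces the full cluster specification, carry over your cluster's existing settings.

```python
# Sketch: explicitly enable idle-context tracking (context auto-eviction)
# on an existing cluster via the Clusters API 2.0. Note that clusters/edit
# replaces the whole cluster spec, so include all of your existing settings.
import os
import requests

host = os.environ["DATABRICKS_HOST"]
headers = {"Authorization": f"Bearer {os.environ['DATABRICKS_TOKEN']}"}

requests.post(
    f"{host}/api/2.0/clusters/edit",
    headers=headers,
    json={
        "cluster_id": os.environ["CLUSTER_ID"],
        "cluster_name": "shared-interactive",   # placeholder: use your cluster's values
        "spark_version": "13.3.x-scala2.12",
        "node_type_id": "i3.xlarge",
        "num_workers": 2,
        "spark_conf": {
            # Auto-eviction is on by default; setting it explicitly documents intent.
            "spark.databricks.chauffeur.enableIdleContextTracking": "true"
        },
    },
)
```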
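As a sketch of the job-cluster approach, the example below submits a notebook as a one-time run on an ephemeral job cluster through the Jobs API 2.1, so the run does not consume an execution context on a shared interactive cluster. The notebook path, node type, and runtime version are placeholders to adjust for your workspace.

```python
# Sketch: run a notebook on an ephemeral job cluster instead of attaching it
# to a shared interactive cluster. Assumes the Jobs API 2.1 runs/submit endpoint.
import os
import requests

host = os.environ["DATABRICKS_HOST"]
headers = {"Authorization": f"Bearer {os.environ['DATABRICKS_TOKEN']}"}

run = requests.post(
    f"{host}/api/2.1/jobs/runs/submit",
    headers=headers,
    json={
        "run_name": "etl-notebook-on-job-cluster",
        "tasks": [
            {
                "task_key": "etl",
                "notebook_task": {"notebook_path": "/Repos/team/etl/main"},
                "new_cluster": {  # job cluster created for this run and terminated afterward
                    "spark_version": "13.3.x-scala2.12",
                    "node_type_id": "i3.xlarge",
                    "num_workers": 2,
                },
            }
        ],
    },
)
print(run.json())  # contains the run_id you can poll with /api/2.1/jobs/runs/get
```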