Share code between Databricks notebooks

Databricks supports several methods for sharing code among notebooks. Each of these methods lets you modularize and share code across notebooks, just as you would with a library.

Databricks also supports multi-task jobs, which allow you to combine notebooks into workflows with complex dependencies.

This article describes how to use %run to run a notebook from another notebook, and how to use Databricks Repos to import source code files from a repository into your notebook.

Use %run to import a notebook

The %run command allows you to include another notebook within a notebook. You can use %run to modularize your code, for example by putting supporting functions in a separate notebook. You can also use it to concatenate notebooks that implement the steps in an analysis. When you use %run, the called notebook is immediately executed and the functions and variables defined in it become available in the calling notebook.

In the example below, the first notebook defines a function, reverse, which is available in the second notebook after you use the %run magic to execute shared-code-notebook.

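A minimal sketch of the two notebooks, assuming shared-code-notebook contains a single Python cell (the body of reverse is illustrative):

```python
# Cell in shared-code-notebook
def reverse(s):
    """Return the input string reversed."""
    return s[::-1]
```

In the calling notebook, the %run command goes in a cell by itself:

```python
%run ./shared-code-notebook
```

After that cell runs, reverse is available in later cells of the calling notebook:

```python
# Later cell in the calling notebook
print(reverse("hello"))  # olleh
```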

Because both of these notebooks are in the same directory in the workspace, use the prefix ./ in ./shared-code-notebook to indicate that the path should be resolved relative to the currently running notebook. You can organize notebooks into directories, such as %run ./dir/notebook, or use an absolute path like %run /Users/username@organization.com/directory/notebook.

Note

  • %run must be in a cell by itself, because it runs the entire notebook inline.

  • You cannot use %run to run a Python file and import the entities defined in that file into a notebook. To import from a Python file, see Use Databricks Repos to reference source code files. Or, package the file into a Python library, create a Databricks library from that Python library, and install the library into the cluster you use to run your notebook.

  • When you use %run to run a notebook that contains widgets, by default the specified notebook runs with the widget’s default values. You can also pass in values to widgets, as shown below; see Use Databricks widgets with %run.
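For example, if the called notebook defines a text widget named argument (the widget name and value below are hypothetical), you can override its default value on the %run line:

```python
%run ./shared-code-notebook $argument="custom-value"
```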

Use Databricks Repos to reference source code files

For notebooks stored in a Databricks Repo, you can import source code files from the repository.

For example, suppose a repo includes a file named power.py.

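A minimal sketch of what power.py might contain (the function name and body are illustrative, not the actual contents of the example repo):

```python
# power.py
def nth_power(x, n):
    """Return x raised to the nth power."""
    return x ** n
```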

You can import that file into a notebook and call the functions defined in it.

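A sketch of the corresponding notebook cell, assuming the illustrative nth_power function from the power.py sketch above. For notebooks stored in a Databricks repo, the repo root is typically on the Python path, so the module can be imported by name:

```python
# Notebook cell in the same repo
from power import nth_power

print(nth_power(3, 4))  # 81
```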

For details on working with source code files in Databricks Repos, see Work with Python and R modules.