Access parameter values from a task

This article describes how to access parameter values from code in your tasks, including Databricks notebooks, Python scripts, and SQL files.

Parameters include user-defined parameters, values output from upstream tasks, and metadata values generated by the job. See Parameterize jobs.

While the details vary by task type, there are four common methods used to reference parameter values:

  • Use dbutils in code in a notebook.
  • Use named parameters in SQL.
  • Access parameters passed to your code as arguments.
  • Use dynamic value references when configuring a task.

In each of these cases, you reference the parameter's key to access its value. The key is sometimes referred to as the name of the parameter.

Use dbutils in code in a notebook

Notebook code running in a task can access the values of parameters with the dbutils library. The following example shows how to use dbutils in Python to get the value of a year_param task parameter that's passed into the notebook task.

Python
# Retrieve the task parameter
year_value = dbutils.widgets.get("year_param")

# Use the value in your code
display(babynames.filter(babynames.Year == year_value))

Parameters are accessed by name. If a task parameter and a job parameter have the same name, the job parameter value is fetched.

The above code produces an error when it runs in a standalone notebook rather than as part of a job, because parameters are not passed to a standalone notebook. You can set a default for the year_param parameter with the following code:

Python
# Set a default (for when not running in a job)
dbutils.widgets.text("year_param", "2012", "Year Parameter")

# Retrieve the task parameter (the default is used if it isn't set)
year_value = dbutils.widgets.get("year_param")

# Use the value in your code
display(babynames.filter(babynames.Year == year_value))

While this is helpful for testing outside of a job, it has the drawback of hiding cases where task or job parameters are not set up correctly: if a parameter is missing, the default is used silently.
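
If you prefer to fail fast instead, you can treat a missing parameter as an error. A minimal sketch, assuming no default widget has been created for year_param:

Python
# Fail fast if the parameter was not supplied, rather than
# silently falling back to a default value.
try:
    year_value = dbutils.widgets.get("year_param")
except Exception as e:
    raise ValueError("year_param was not set on the task or job") from e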

Use named parameters in a SQL notebook

When you are running SQL in a notebook task, you can use the named parameter syntax to access task parameters. For example, to access a task parameter called year_param, you could get its value by using :year_param in your query:

SQL
SELECT First_Name, SUM(Count) AS Total_Count
FROM baby_names_prepared
WHERE Year_Of_Birth = :year_param
GROUP BY First_Name

Access as code arguments

For some task types, parameters are passed to the code as arguments. The following task types receive parameters as arguments:

  • Python script
  • Python Wheel
  • JAR
  • Spark Submit

For details, see Details by task type, later in this article.

For dbt tasks, parameters are passed through the dbt commands that you configure in the task.
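
For example, a Python script task receives its parameters on the command line, where they can be parsed with the argparse module. A minimal sketch, assuming the task is configured to pass a --year argument (the name is illustrative, not fixed by the platform):

Python
import argparse

# Parse parameters passed to a Python script task as
# command-line arguments (the --year name is illustrative).
parser = argparse.ArgumentParser()
parser.add_argument("--year", default="2012")
args = parser.parse_args()

print(f"Processing year {args.year}")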

Use dynamic value references when configuring a task

When you are configuring a task in the Databricks UI, use the dynamic value reference syntax to access job parameters or other dynamic values. To access job parameters, use the syntax {{job.parameters.<name>}}. For example, when configuring a Python wheel task, you can set a parameter's Key input to year and its Value input to Year_{{job.parameters.year_param}} to reference a job parameter called year_param.

Besides providing access to parameters in configuration, dynamic value references also give you access to other data about your job or task, such as {{job.id}}. You can click {} in the task configuration to see a list of available dynamic values and insert them into your configuration.
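
As an illustration, suppose a notebook task has a task parameter whose Key is year and whose Value is Year_{{job.parameters.year_param}}. The reference is resolved before the task runs, so the code sees the substituted string. A minimal sketch, assuming the job parameter year_param is set to 2012:

Python
# The dynamic value reference in the task configuration is resolved
# before the task runs, so this returns the substituted string,
# for example "Year_2012".
year_value = dbutils.widgets.get("year")
print(year_value)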

Details by task type

Which of these methods you use depends on the task type. The following list describes, for each task type, how to access parameters in configuration and in code.

Notebooks
  • Access in configuration: You can use dynamic value references in the Databricks UI to configure the notebook (for example, to refer to job parameters in the task parameter values). You can override or add parameters when you manually run a task using the Run a job with different parameters option.
  • Access in code: Use named parameters for SQL in your notebook, or dbutils.widgets in your code.

Python script
  • Access in configuration: Parameters defined in the task are passed as arguments to your script. You can use dynamic value references in the Parameters text box.
  • Access in code: Read the parameters as positional arguments, or parse them with the argparse module in Python.

Python wheel
  • Access in configuration: Parameters defined in the task definition are passed as keyword arguments to your code. Your Python wheel files must be configured to accept keyword arguments. You can use dynamic value references in the values of your parameters.
  • Access in code: Access the parameters as keyword arguments. For an example of reading arguments in a Python script packaged in a Python wheel file, see Use a Python wheel file in a Databricks job, and see the sketch following this list.

SQL
  • Access in configuration: You can use dynamic value references in your task configuration.
  • Access in code: Use named parameters to access parameter values.

Pipeline
  • Access in configuration: Pipelines do not support passing parameters to the task.
  • Access in code: Not supported.

dbt
  • Access in configuration: You can use dynamic value references to pass parameters in the dbt commands when configuring your task.
  • Access in code: Access the parameters through the dbt commands.

JAR
  • Access in configuration: You can use dynamic value references to pass parameters as arguments in the Parameters text box when configuring your task.
  • Access in code: Parameters are accessed as arguments to the main method of the main class.

Spark Submit
  • Access in configuration: You can use dynamic value references to pass parameters as arguments in the Parameters text box when configuring your task.
  • Access in code: Parameters are accessed as arguments to the main method of the main class.

Run Job
  • Access in configuration: You can use dynamic value references to create a set of job parameters when configuring your task. The values can include dynamic value references.
  • Access in code: Not applicable.

If/else condition
  • Access in configuration: You can use dynamic value references when configuring your task, for example, in the Condition.
  • Access in code: Not applicable.

For each
  • Access in configuration: You can use dynamic value references when configuring the Inputs for your task. The nested task receives one input as a task parameter for each iteration.
  • Access in code: The nested task accesses parameters based on its task type.
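
As an illustration of the Python wheel entry above, the following is a minimal sketch of an entry point that reads its keyword arguments. It assumes the named parameters arrive on the command line as --key=value pairs; the year key is illustrative, not fixed by the platform.

Python
import sys

def main():
    # Named parameters arrive on the command line as --key=value
    # pairs (the "year" key here is illustrative).
    args = dict(
        arg.lstrip("-").split("=", 1) for arg in sys.argv[1:] if "=" in arg
    )
    year = args.get("year", "2012")
    print(f"Running with year={year}")

if __name__ == "__main__":
    main()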