Secrets

A secret is a key-value pair that stores secret material, with a key name unique within a secret scope. Each scope is limited to 1000 secrets. The maximum allowed secret value size is 128 KB.

Create a secret

Secret names are case insensitive.

To create a secret in a Databricks-backed scope using the Databricks CLI (version 0.7.1 and above):

databricks secrets put --scope <scope-name> --key <key-name>

An editor opens and displays content like this:

# ----------------------------------------------------------------------
# Do not edit the above line. Everything that follows it will be ignored.
# Please input your secret value above the line. Text will be stored in
# UTF-8 (MB4) form and any trailing new line will be stripped.
# Exit without saving will abort writing secret.

Paste your secret value above the line, then save and exit the editor. The comment lines are stripped from your input, and the secret value is stored and associated with the key in the scope.

If you issue a write request with a key that already exists, the new value overwrites the existing value.

You can also provide a secret from a file or from the command line. For more information about writing secrets, see Secrets CLI.
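For example, a sketch of writing a secret from the command line or a file (the --string-value and --binary-file options are available in recent versions of the legacy CLI; check databricks secrets put --help for your version):

databricks secrets put --scope <scope-name> --key <key-name> --string-value <value>
databricks secrets put --scope <scope-name> --key <key-name> --binary-file <path-to-file>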

List secrets

To list secrets in a given scope:

databricks secrets list --scope <scope-name>

The response displays metadata about each secret, such as the secret key name and the last updated timestamp (in milliseconds since epoch). For example:

databricks secrets list --scope jdbc
Key name    Last updated
----------  --------------
password    1531968449039
username    1531968408097
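You can also list a scope's secrets from a notebook with the Secrets utility. A minimal Python sketch, using the jdbc scope from the example above:

# Print the key name of each secret in the "jdbc" scope
for secret in dbutils.secrets.list("jdbc"):
    print(secret.key)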

Read a secret

You create secrets using the REST API or CLI, but you must use the Secrets utility (dbutils.secrets) in a notebook or job to read a secret.
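For example, to read the password secret in the jdbc scope shown earlier, in Python:

# Fetch the secret value; the value is redacted if displayed in notebook output
password = dbutils.secrets.get(scope="jdbc", key="password")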

Secret paths in a Spark configuration property or environment variable

Preview

This feature is in Public Preview.

Note

Available in Databricks Runtime 6.4 and above.

You can store the path to a secret in a Spark configuration property or environment variable. Retrieved secrets are redacted from notebook output and Spark driver and executor logs.

Important

Secrets are not redacted from the Spark driver log stdout and stderr streams. By default, Spark driver logs are viewable by users with the Can Attach To, Can Restart, or Can Manage cluster-level permission. To ensure that secrets that might appear in those driver log streams are visible only to users with the Can Manage permission on the cluster, set the following Spark configuration property on the cluster:

spark.databricks.acl.needAdminPermissionToViewLogs true

Requirements and limitations

The following requirements and limitations apply to storing secret paths.

  • Cluster owners must have Can Read permission on the secret scope.
  • Only cluster owners can add a secret path to a Spark configuration property or environment variable, or edit the existing scope and name. Owners change a secret's value using the Put secret API (see the sketch after this list); you must restart your cluster to fetch the new value.
  • Users with the Can Manage permission on the cluster can delete a secret Spark configuration property or environment variable.
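For example, a sketch of rotating a secret's value with the Put secret API (Secrets API 2.0 endpoint; the workspace URL, token, and names are placeholders):

curl -X POST https://<databricks-instance>/api/2.0/secrets/put \
  -H "Authorization: Bearer <token>" \
  -d '{"scope": "scope1", "key": "key1", "string_value": "<new-value>"}'

Restart the cluster afterward so it fetches the new value.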

Path value

The syntax of the Spark configuration property or environment variable path value must be {{secrets/<scope-name>/<secret-name>}}. The value must start with {{secrets/ and end with }}.

The variable portions of the Spark configuration property or environment variable are:

  • <scope-name>: The name of the scope with which the secret is associated.
  • <secret-name>: The unique name of the secret in the scope.

For example, {{secrets/scope1/key1}}.

Note

  • Do not include spaces inside the curly brackets; any spaces are treated as part of the scope or secret name.
  • If the value uses incorrect syntax, such as a single opening or closing brace, it is treated as a literal Spark configuration property or environment variable value rather than a secret path.

Retrieve a Spark configuration property from a secret

You specify a secret path in a Spark configuration property in the following format:

spark.<secret-prop-name> <path-value>

spark.<secret-prop-name> is a Spark configuration property name that maps to the secret path. You can add multiple secrets to the Spark configuration as long as the secret property names are unique.

Example

spark.password {{secrets/scope1/key1}}
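To map more than one secret, give each property a unique name. An illustrative sketch (the property, scope, and key names are assumptions):

spark.password {{secrets/scope1/key1}}
spark.username {{secrets/scope1/key2}}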

To fetch the secret in the notebook and use it:

Python:

spark.conf.get("spark.password")

SQL:

%sql SELECT ${spark.password};

Store the path to a secret in an environment variable

You specify a secret path in an environment variable and use it in a cluster-scoped init script. These environment variables are not accessible from a program running in Spark.

SPARKPASSWORD=<path-value>

To fetch the secret in an init script, access $SPARKPASSWORD:

if [[ -n "${SPARKPASSWORD}" ]]; then
  # Placeholder: replace "use" with the command that consumes the secret value
  use "${SPARKPASSWORD}"
fi

Delete a secret

To delete a secret from a scope with the Databricks CLI:

databricks secrets delete --scope <scope-name> --key <key-name>

Once the secret is deleted, all subsequent requests for that key through the Secrets utility (dbutils.secrets) will fail.

You can also use the Secrets API.
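For example, a sketch using the Secrets API 2.0 delete endpoint (the workspace URL and token are placeholders):

curl -X POST https://<databricks-instance>/api/2.0/secrets/delete \
  -H "Authorization: Bearer <token>" \
  -d '{"scope": "<scope-name>", "key": "<key-name>"}'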