A secret is a key-value pair that stores secret material, with a key name unique within a secret scope. Each scope is limited to 1000 secrets. The maximum allowed secret value size is 128 KB.

Create a secret

Secret names are case insensitive.

To create a secret in a Databricks-backed scope using the Databricks CLI (version 0.7.1 and above):

databricks secrets put --scope <scope-name> --key <key-name>

An editor opens and displays content like this:

# ----------------------------------------------------------------------
# Do not edit the above line. Everything that follows it will be ignored.
# Please input your secret value above the line. Text will be stored in
# UTF-8 (MB4) form and any trailing new line will be stripped.
# Exit without saving will abort writing secret.

Paste your secret value above the line, then save and exit the editor. The comment lines are stripped, and the value is stored against the key in the scope.

If you issue a write request with a key that already exists, the new value overwrites the existing value.

You can also provide a secret from a file or from the command line. For more information about writing secrets, see Secrets CLI.

List secrets

To list secrets in a given scope:

databricks secrets list --scope <scope-name>

The response displays metadata about each secret, such as the key name and the last-updated timestamp (in milliseconds since epoch). For example:

databricks secrets list --scope jdbc
Key name    Last updated
----------  --------------
password    1531968449039
username    1531968408097

Read a secret

You create secrets using the REST API or CLI, but you must use the Secrets utilities in a notebook or job to read a secret.
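For example, in a Python notebook or job you read a secret with dbutils.secrets.get. The sketch below is illustrative only: on Databricks, dbutils is predefined in notebooks, so the stub classes exist solely to make the snippet self-contained outside a workspace, and the scope and key names are made up.

```python
# On Databricks, `dbutils` is predefined in notebooks; the stub below only
# stands in for it so this sketch runs outside a workspace. Remove it there.
class _Secrets:
    def get(self, scope, key):
        # A real workspace returns the stored secret value for scope/key.
        return {("jdbc", "password"): "s3cr3t"}[(scope, key)]

class _DBUtils:
    secrets = _Secrets()

dbutils = _DBUtils()

# Read the secret stored under key "password" in scope "jdbc".
password = dbutils.secrets.get(scope="jdbc", key="password")
```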

Secret paths in Spark configuration properties and environment variables


This feature is in Public Preview.


Available in Databricks Runtime 6.1 and above.

You can store the path to a secret in a Spark configuration property or environment variable. Retrieved secrets are redacted from notebook output and Spark driver and executor logs.


Secrets are not redacted from stdout and stderr. As a workaround, set the Spark configuration property spark.databricks.acl.needAdminPermissionToViewLogs to true so that only users with the Can Manage permission can view the stdout page.

Requirements and limitations

  • Cluster owners must have Read permission on the secret scope.
  • Only cluster owners can add a path to a secret in a Spark configuration property or environment variable and edit the existing scope and name. Owners change a secret's value using the Put secret API. You must restart your cluster to fetch the secret again.
  • Users with the Can Manage permission on the cluster can delete secret properties and environment variables.

Path value

The syntax of the Spark property or environment variable path value must be {{secrets/<scope-name>/<secret-name>}}.

The value must start with {{secrets/ and end with }}. The variable portions of the property or environment variable are:

  • <secret-prop-name>: The name of the secret property in the Spark configuration.
  • <scope-name>: The name of the scope that contains the secret.
  • <secret-name>: The unique name of the secret in the scope.


  • There should be no spaces between the curly brackets. If there are spaces, they are treated as part of the scope or secret name.
  • If the value format is incorrect (for example, it has only a single opening or closing brace), the value is treated as a regular Spark configuration property or environment variable value.
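The format rules above can be sketched as a small check. This is illustrative only (the helper name is made up, and the cluster applies its own parsing):

```python
import re

# A well-formed path value: {{secrets/<scope-name>/<secret-name>}}.
# Spaces are NOT trimmed; per the rules above they become part of the
# scope or secret name, so the pattern only excludes "/" and "}".
_SECRET_PATH = re.compile(r"^\{\{secrets/([^/}]+)/([^/}]+)\}\}$")

def parse_secret_path(value):
    """Return (scope, name) for a well-formed path value, else None."""
    m = _SECRET_PATH.match(value)
    return m.groups() if m else None

print(parse_secret_path("{{secrets/testScope/testKey1}}"))    # ('testScope', 'testKey1')
print(parse_secret_path("{{secrets/ testScope /testKey1}}"))  # (' testScope ', 'testKey1')
print(parse_secret_path("{secrets/testScope/testKey1}"))      # None: single braces
```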

Store the path to a secret in a Spark configuration property

You specify a secret path in a Spark configuration property in the following format:

spark.<secret-prop-name> <path-value>

spark.<secret-prop-name> is a Spark configuration property name that maps to the secret path. You can add multiple secrets to the Spark configuration, as long as the secret property names are unique. For example:


spark.password {{secrets/testScope/testKey1}}

To fetch the secret in a notebook and use it, run spark.conf.get("spark.<secret-prop-name>"). For example:

spark.conf.get("spark.password")

Store the path to a secret in an environment variable

You specify a secret path in an environment variable in the following format and use it in a cluster-scoped init script:

<variable-name>=<path-value>

For example:

SPARKPASSWORD={{secrets/testScope/testKey1}}

These environment variables are not accessible from a program running in Spark.

To fetch the secret in an init script, access $SPARKPASSWORD:

if [[ -n "$SPARKPASSWORD" ]]; then
  # Use the secret here; avoid echoing it, since stdout is not redacted.
  :
fi

Delete a secret

To delete a secret from a scope:

databricks secrets delete --scope <scope-name> --key <key-name>

Once the secret is deleted, all subsequent requests for that key through the Databricks Utilities secrets interface will fail.