Secret workflow example

In this workflow example, we use secrets to set up JDBC credentials for connecting to an Azure Data Lake Store.

Create a secret scope

Create a secret scope called jdbc.

databricks secrets create-scope jdbc

Note

If your account does not have the Premium plan or above, you must create the scope with MANAGE permission granted to all users (“users”). For example:

databricks secrets create-scope jdbc --initial-manage-principal users
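
To confirm that the scope was created, you can list the secret scopes in your workspace:

databricks secrets list-scopes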

Create secrets

Add the secrets username and password. Run the following commands and enter each secret value in the editor that opens.

databricks secrets put-secret jdbc username
databricks secrets put-secret jdbc password
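
To verify that the secrets were stored, you can list the keys in the scope (only key names and metadata are shown, never the secret values):

databricks secrets list-secrets jdbc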

Use the secrets in a notebook

In a notebook, read the secrets that are stored in the secret scope jdbc to configure a JDBC connector:

// JDBC driver class and connection properties
val driverClass = "com.microsoft.sqlserver.jdbc.SQLServerDriver"
val connectionProperties = new java.util.Properties()
connectionProperties.setProperty("Driver", driverClass)

// Read the credentials from the jdbc secret scope
val jdbcUsername = dbutils.secrets.get(scope = "jdbc", key = "username")
val jdbcPassword = dbutils.secrets.get(scope = "jdbc", key = "password")
connectionProperties.put("user", jdbcUsername)
connectionProperties.put("password", jdbcPassword)

You can now use connectionProperties with the JDBC connector to talk to your data source. The values fetched from the scope are never displayed in the notebook (see Secret redaction).
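
For example, a minimal sketch of reading a table with these properties (the server, database, and table names are placeholders; substitute your own values):

// Placeholder JDBC URL; replace the server, port, and database with your own values.
val jdbcUrl = "jdbc:sqlserver://<server-name>:1433;database=<database-name>"

// Read a table through the JDBC data source using the connection properties configured above.
val df = spark.read.jdbc(jdbcUrl, "<table-name>", connectionProperties)
display(df)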

Grant access to another group

Note

This step requires that your account have the Premium plan or above.

After verifying that the credentials were configured correctly, share these credentials with the datascience group for use in their analysis by granting them permissions to read the secret scope and list the available secrets.

Grant the datascience group READ permission on these credentials by running the following command:

databricks secrets put-acl jdbc datascience READ
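
To confirm that the grant was applied, you can list the ACLs on the scope:

databricks secrets list-acls jdbc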

For more information about secret access control, see Secret ACLs.