Authentication for Databricks automation - overview
In Databricks, authentication refers to verifying a Databricks identity (such as a user, service principal, or group). Databricks uses credentials (such as an access token or a username and password) to verify the identity.
After Databricks verifies the caller’s identity, Databricks then uses a process called authorization to determine whether the verified identity has sufficient access permissions to perform the specified action on the resource at the given location. This article includes details only about authentication. It does not include details about authorization or access permissions; see Authentication and access control.
When a tool makes an automation or API request, it includes credentials that authenticate an identity with Databricks. This article describes typical ways to create, store, and pass credentials and related information that Databricks needs to authenticate and authorize requests. To learn which credential types, related information, and storage mechanisms are supported by your tools, SDKs, scripts, and apps, see Supported authentication types by Databricks tool or SDK or your provider's documentation.
Common tasks for Databricks authentication
Use the following instructions to complete common tasks for Databricks authentication.
| To complete this task… | Follow the instructions in this article |
|---|---|
| Create a Databricks user that you can use for authenticating at the Databricks account level. | |
| Create a Databricks user that you can use for authenticating with a specific Databricks workspace. | |
| Create a Databricks personal access token for a Databricks user. (This Databricks personal access token can be used only for authenticating with its associated Databricks workspace.) | |
| Create a Databricks service principal, and then create a Databricks personal access token for that Databricks service principal. (This Databricks personal access token can be used only for authenticating with its associated Databricks workspace.) | Provision a service principal for Databricks automation - Databricks UI. See also Databricks personal access tokens for service principals. |
| Create a Databricks configuration profile. | Databricks configuration profiles (later in this article). |
| Create a Databricks group, and add Databricks users and Databricks service principals to that group, for more robust authorization. | Manage account groups using the account console; Manage account groups using the workspace admin settings page |
Supported Databricks authentication types
Databricks provides several ways to authenticate Databricks users and service principals, as follows:
| Authentication type | Details |
|---|---|
| OAuth machine-to-machine (M2M) authentication | Uses short-lived OAuth tokens for a Databricks service principal. Suited to fully automated, unattended scenarios such as CI/CD. |
| OAuth user-to-machine (U2M) authentication | Uses short-lived OAuth tokens for a Databricks user. Typically involves an interactive, browser-based sign-in. |
| Databricks personal access token authentication | Uses a Databricks personal access token for a Databricks user or service principal. |
| Basic authentication (not recommended in production) | Uses a Databricks username and password. |
Supported authentication types by Databricks tool or SDK
Databricks tools and SDKs that work with one or more supported Databricks authentication types include the following:
| Tool or SDK | Supported authentication types |
|---|---|
| Databricks CLI | For specific Databricks CLI authentication documentation, including how to set up and use Databricks configuration profiles to switch among multiple related authentication settings, see the Databricks CLI authentication documentation. For additional technical details about the Databricks CLI, see What is the Databricks CLI?. |
| Databricks Terraform provider | For specific Databricks Terraform provider authentication documentation, including how to store and use credentials through environment variables and Databricks configuration profiles, see Authentication in the Databricks Terraform provider documentation. For additional technical details about the Databricks Terraform provider, see Databricks Terraform provider. |
| Databricks Connect | For specific Databricks Connect authentication documentation and additional technical details about Databricks Connect, see What is Databricks Connect?. |
| Databricks extension for Visual Studio Code | For specific Databricks extension for Visual Studio Code authentication documentation, see Authentication setup for the Databricks extension for Visual Studio Code. For additional technical details about the Databricks extension for Visual Studio Code, see What is the Databricks extension for Visual Studio Code?. |
| Databricks SDK for Python | For specific Databricks SDK for Python authentication documentation and additional technical details, see Databricks SDK for Python. |
| Databricks SDK for Java | For specific Databricks SDK for Java authentication documentation and additional technical details, see Databricks SDK for Java. |
| Databricks SDK for Go | For specific Databricks SDK for Go authentication documentation and additional technical details, see Databricks SDK for Go. |
| Other Databricks tools and SDKs | See the tool's or SDK's documentation. |
Databricks account and workspace REST APIs
Databricks organizes its REST API into two categories: account APIs and workspace APIs. Each category requires a different set of information to authenticate the target Databricks identity. In addition, each supported Databricks authentication type requires additional information that uniquely identifies the target Databricks identity.
For instance, to authenticate a Databricks identity for calling Databricks account-level API operations, you must provide:

- The target Databricks account console URL, which is typically `https://accounts.cloud.databricks.com`.
- The target Databricks account ID. See Locate your account ID.
- Information that uniquely identifies the target Databricks identity for the target Databricks authentication type. For the specific information to provide, see the section later in this article for that authentication type.

To authenticate a Databricks identity for calling Databricks workspace-level API operations, you must provide:

- The target Databricks workspace URL, for example `https://dbc-a1b2345c-d6e7.cloud.databricks.com`.
- Information that uniquely identifies the target Databricks identity for the target Databricks authentication type. For the specific information to provide, see the section later in this article for that authentication type.
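With the Databricks SDK for Python, for example, the distinction looks roughly like the following sketch. The URLs, account ID, and credential values are placeholders, and the exact client arguments you need depend on the authentication type you choose.

```python
# A minimal sketch using the Databricks SDK for Python (all values are placeholders).
from databricks.sdk import AccountClient, WorkspaceClient

# Account-level APIs: authenticate against the account console URL and account ID.
account = AccountClient(
    host="https://accounts.cloud.databricks.com",        # account console URL
    account_id="00000000-0000-0000-0000-000000000000",   # see "Locate your account ID"
    client_id="<service-principal-client-id>",           # plus credentials for your auth type
    client_secret="<service-principal-secret>",
)

# Workspace-level APIs: authenticate against the workspace URL.
workspace = WorkspaceClient(
    host="https://dbc-a1b2345c-d6e7.cloud.databricks.com",
    token="<personal-access-token>",
)
```

Account-level calls (for example, listing the workspaces in the account) then go through `account`, while workspace-level calls go through `workspace`.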
Databricks client unified authentication
Databricks provides a consolidated and consistent architectural and programmatic approach to authentication, known as Databricks client unified authentication. This approach helps make setting up and automating authentication with Databricks more centralized and predictable. It enables you to configure Databricks authentication once and then use that configuration across multiple Databricks tools and SDKs without further authentication configuration changes.
Participating Databricks tools and SDKs include:

- The Databricks CLI
- The Databricks Terraform provider
- Databricks Connect
- The Databricks extension for Visual Studio Code
- The Databricks SDK for Python
- The Databricks SDK for Java
- The Databricks SDK for Go
All participating tools and SDKs accept special environment variables and Databricks configuration profiles for authentication. The Databricks Terraform provider and the Databricks SDKs for Python, Java, and Go also accept direct configuration of authentication settings within code. For details, see Supported authentication types by Databricks tool or SDK or the tool’s or SDK’s documentation.
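For instance, assuming you have already set the relevant environment variables or a `DEFAULT` profile in `.databrickscfg`, a participating SDK can authenticate with no explicit configuration in code. The following Python sketch illustrates the idea.

```python
from databricks.sdk import WorkspaceClient

# No arguments: the SDK resolves credentials from the environment
# (for example DATABRICKS_HOST / DATABRICKS_TOKEN) or from the DEFAULT
# profile in .databrickscfg, per Databricks client unified authentication.
w = WorkspaceClient()
print(w.current_user.me().user_name)
```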
Default order of evaluation for client unified authentication methods and credentials
Whenever a participating tool or SDK needs to authenticate with Databricks, it tries the supported types of authentication in a default order. When the tool or SDK succeeds with the type of authentication that it tries, it stops trying the remaining authentication types. To force an SDK to authenticate with a specific authentication type, set the `Config` API's Databricks authentication type field.

For each authentication type that the participating tool or SDK tries, it looks for authentication credentials in the following locations, in the following order. When the tool or SDK finds authentication credentials that can be used, it stops looking in the remaining locations.

1. Credential-related `Config` API fields (for SDKs). To set `Config` fields, see Supported authentication types by Databricks tool or SDK or the SDK's reference documentation.
2. Credential-related environment variables. To set environment variables, see Supported authentication types by Databricks tool or SDK and your operating system's documentation.
3. Credential-related fields in the `DEFAULT` configuration profile within the `.databrickscfg` file. To set configuration profile fields, see Supported authentication types by Databricks tool or SDK and Databricks configuration profiles.
To provide maximum portability for your code, Databricks recommends that you create a custom configuration profile within the `.databrickscfg` file, add the required fields for your target Databricks authentication type to the custom configuration profile, and then set the `DATABRICKS_CONFIG_PROFILE` environment variable to the name of the custom configuration profile. For more information, see Supported authentication types by Databricks tool or SDK.
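As an illustration of that recommendation, the following sketch assumes a custom profile named `MY-PROFILE` already exists in `.databrickscfg`; the environment-variable approach and the explicit profile name are interchangeable here.

```python
import os
from databricks.sdk import WorkspaceClient

# Option 1: point participating tools and SDKs at a custom profile by name.
# (Typically you would export DATABRICKS_CONFIG_PROFILE in your shell instead.)
os.environ["DATABRICKS_CONFIG_PROFILE"] = "MY-PROFILE"
w = WorkspaceClient()

# Option 2: name the profile explicitly when constructing the client.
w = WorkspaceClient(profile="MY-PROFILE")
```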
Environment variables and fields for client unified authentication
The following tables list the names and descriptions of the supported environment variables and fields for Databricks client unified authentication. In the following tables:

- Environment variable, where applicable, is the name of the environment variable. To set environment variables, see Supported authentication types by Databricks tool or SDK and your operating system's documentation.
- `.databrickscfg` field, where applicable, is the name of the field within a Databricks configuration profiles file. To set `.databrickscfg` fields, see Supported authentication types by Databricks tool or SDK and Databricks configuration profiles.
- Terraform field, where applicable, is the name of the field within a Databricks Terraform configuration. To set Databricks Terraform fields, see Authentication in the Databricks Terraform provider documentation.
- `Config` field is the name of the field within the `Config` API for the specified SDK. To use the `Config` API, see Supported authentication types by Databricks tool or SDK or the SDK's reference documentation.
General host, token, and account ID environment variables and fields
| Common name | Description | Environment variable | `.databrickscfg` field, Terraform field | `Config` field |
|---|---|---|---|---|
| Databricks host | (String) The Databricks host URL for either the Databricks workspace endpoint or the Databricks accounts endpoint. | `DATABRICKS_HOST` | `host` | `host` |
| Databricks token | (String) The Databricks personal access token. | `DATABRICKS_TOKEN` | `token` | `token` |
| Databricks account ID | (String) The Databricks account ID for the Databricks account endpoint. Only has effect when the Databricks host is also set to `https://accounts.cloud.databricks.com`. | `DATABRICKS_ACCOUNT_ID` | `account_id` | `account_id` |
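As a rough illustration of how the same host and token values can be supplied through different mechanisms, the following Python sketch uses placeholder values; in practice you would set only one of these, and environment variables would normally be exported in your shell rather than in code.

```python
import os
from databricks.sdk import WorkspaceClient

# Environment variables:
os.environ["DATABRICKS_HOST"] = "https://dbc-a1b2345c-d6e7.cloud.databricks.com"
os.environ["DATABRICKS_TOKEN"] = "<personal-access-token>"
w_from_env = WorkspaceClient()

# Or the equivalent Config fields, passed directly to the client:
w_from_args = WorkspaceClient(
    host="https://dbc-a1b2345c-d6e7.cloud.databricks.com",
    token="<personal-access-token>",
)
```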
AWS-specific environment variables and fields
| Common name | Description | Environment variable | `.databrickscfg` field, Terraform field | `Config` field |
|---|---|---|---|---|
| Databricks username | (String) The Databricks user's username. | `DATABRICKS_USERNAME` | `username` | `username` |
| Databricks password | (String) The Databricks user's password. | `DATABRICKS_PASSWORD` | `password` | `password` |
| Service principal client ID | (String) The Databricks service principal's client ID. | `DATABRICKS_CLIENT_ID` | `client_id` | `client_id` |
| Service principal secret | (String) The Databricks service principal's secret. | `DATABRICKS_CLIENT_SECRET` | `client_secret` | `client_secret` |
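For instance, a service principal's client ID and secret map onto the `Config` fields of the same names. The following sketch assumes an existing service principal and uses placeholder values.

```python
from databricks.sdk import WorkspaceClient

# OAuth M2M: the service principal's client ID and secret stand in for
# DATABRICKS_CLIENT_ID / DATABRICKS_CLIENT_SECRET, or for the client_id /
# client_secret fields in a .databrickscfg profile.
w = WorkspaceClient(
    host="https://dbc-a1b2345c-d6e7.cloud.databricks.com",
    client_id="<service-principal-client-id>",
    client_secret="<service-principal-secret>",
)
```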
.databrickscfg-specific environment variables and fields
Use these environment variables or fields to specify non-default settings for `.databrickscfg`. See also Databricks configuration profiles.

| Common name | Description | Environment variable | Terraform field | `Config` field |
|---|---|---|---|---|
| Databricks configuration profiles file path | (String) A non-default path to the `.databrickscfg` file. | `DATABRICKS_CONFIG_FILE` | `config_file` | `config_file` |
| Default Databricks configuration profile | (String) The default named profile to use, other than `DEFAULT`. | `DATABRICKS_CONFIG_PROFILE` | `profile` | `profile` |
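For example, assuming a profiles file stored outside your home folder, these settings could be combined as in the following sketch (the path and profile name are placeholders, and in practice you would usually export the environment variables in your shell).

```python
import os
from databricks.sdk import WorkspaceClient

# Point participating tools and SDKs at a non-default profiles file and a named profile.
os.environ["DATABRICKS_CONFIG_FILE"] = "/opt/databricks/config/databrickscfg"
os.environ["DATABRICKS_CONFIG_PROFILE"] = "DEVELOPMENT"
w = WorkspaceClient()
```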
Authentication type field
Use this environment variable or field to force an SDK to use a specific type of Databricks authentication.
| Common name | Description | Terraform field | `Config` field |
|---|---|---|---|
| Databricks authentication type | (String) When multiple authentication attributes are available in the environment, use the authentication type specified by this argument. | `auth_type` | `auth_type` |
Supported Databricks authentication type field values include:

- `oauth-m2m`: OAuth machine-to-machine (M2M) authentication
- `databricks-cli`: OAuth user-to-machine (U2M) authentication
- `pat`: Databricks personal access token authentication
- `basic`: basic authentication
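For example, if both a personal access token and OAuth credentials happen to be available in your environment, you can pin an SDK to one of them. The following sketch assumes those credential values are already set elsewhere.

```python
from databricks.sdk import WorkspaceClient

# Force OAuth U2M authentication even if other credential types are also available.
w = WorkspaceClient(auth_type="databricks-cli")
```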
Databricks configuration profiles
A Databricks configuration profile (sometimes referred to as a configuration profile, a config profile, or simply a `profile`) contains settings and other information that Databricks needs to authenticate. Databricks configuration profiles are stored in Databricks configuration profiles files for your tools, SDKs, scripts, and apps to use. To learn whether Databricks configuration profiles are supported by your tools, SDKs, scripts, and apps, see your provider's documentation. All participating tools and SDKs that implement Databricks client unified authentication support Databricks configuration profiles. For more information, see Supported authentication types by Databricks tool or SDK.
To create a Databricks configuration profiles file:

1. Use your favorite text editor to create a file named `.databrickscfg` in your `~` (your user home) folder on Unix, Linux, or macOS, or in your `%USERPROFILE%` (your user home) folder on Windows, if you do not already have one. Do not forget the dot (`.`) at the beginning of the file name.

2. Add the following contents to this file:

   [<some-unique-name-for-this-configuration-profile>]
   <field-name> = <field-value>

3. In the preceding contents, replace the following values, and then save the file:

   - `<some-unique-name-for-this-configuration-profile>` with a unique name for the configuration profile, such as `DEFAULT`, `DEVELOPMENT`, `PRODUCTION`, or similar. You can have multiple configuration profiles in the same `.databrickscfg` file, but each configuration profile must have a unique name within this file.
   - `<field-name>` and `<field-value>` with the name and a value for one of the required fields for the target Databricks authentication type. For the specific information to provide, see the section earlier in this article for that authentication type.
   - Add a `<field-name>` and `<field-value>` pair for each of the additional required fields for the target Databricks authentication type.
For example, for Databricks personal access token authentication, the `.databrickscfg` file might look like this:
[DEFAULT]
host = https://dbc-a1b2345c-d6e7.cloud.databricks.com
token = dapi123...
To create additional configuration profiles, specify different profile names within the same `.databrickscfg` file. For example, to specify separate Databricks workspaces, each with its own Databricks personal access token:
[DEFAULT]
host = https://dbc-a1b2345c-d6e7.cloud.databricks.com
token = dapi123...
[DEVELOPMENT]
host = https://dbc-b2c3456d-e7f8.cloud.databricks.com
token = dapi234...
You can also specify different profile names within the `.databrickscfg` file for Databricks accounts and different Databricks authentication types, for example:
[DEFAULT]
host = https://dbc-a1b2345c-d6e7.cloud.databricks.com
token = dapi123...
[ACCOUNT]
host = https://accounts.cloud.databricks.com
username = someone@example.com
password = MyP25...
account_id = ab0cd1...
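Once profiles like these exist, tools and SDKs that support Databricks client unified authentication can select them by name. The following Python sketch assumes the DEFAULT and ACCOUNT profiles shown above.

```python
from databricks.sdk import AccountClient, WorkspaceClient

# Uses the DEFAULT profile (workspace host and personal access token).
w = WorkspaceClient(profile="DEFAULT")

# Uses the ACCOUNT profile (account console host, username, password, and account ID).
a = AccountClient(profile="ACCOUNT")
```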
ODBC DSNs
In ODBC, a data source name (DSN) is a symbolic name that tools, SDKs, scripts, and apps use to request a connection to an ODBC data source. A DSN stores connection details such as the path to an ODBC driver, networking details, authentication credentials, and database details. To learn whether ODBC DSNs are supported by your tools, scripts, and apps, see your provider’s documentation.
To install and configure the Databricks ODBC Driver and create an ODBC DSN for Databricks, see ODBC driver.
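As an illustration only: assuming you have already created a DSN named Databricks with the Databricks ODBC Driver (the DSN name is a placeholder), a Python script using the pyodbc package could connect to it roughly like this.

```python
import pyodbc  # third-party package: pip install pyodbc

# "Databricks" is a placeholder DSN name defined in your ODBC data source manager;
# the DSN itself stores the driver path, host, HTTP path, and credentials.
conn = pyodbc.connect("DSN=Databricks", autocommit=True)
cursor = conn.cursor()
cursor.execute("SELECT 1")
print(cursor.fetchone())
conn.close()
```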
JDBC connection URLs
In JDBC, a connection URL is a symbolic URL that tools, SDKs, scripts, and apps use to request a connection to a JDBC data source. A connection URL stores connection details such as networking details, authentication credentials, database details, and JDBC driver capabilities. To learn whether JDBC connection URLs are supported by your tools, SDKs, scripts, and apps, see your provider’s documentation.
To install and configure the Databricks JDBC Driver and create a JDBC connection URL for Databricks, see JDBC driver.
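As a rough illustration of what such a URL carries, the following Python sketch assembles a connection URL for personal access token authentication. The hostname, HTTP path, and token are placeholders, and the exact property names depend on your Databricks JDBC Driver version, so treat this as a sketch and check the driver documentation for the authoritative format.

```python
# Illustrative only: property names vary by Databricks JDBC Driver version.
server_hostname = "dbc-a1b2345c-d6e7.cloud.databricks.com"   # placeholder workspace host
http_path = "/sql/1.0/warehouses/1234567890abcdef"           # placeholder HTTP path
access_token = "<personal-access-token>"

jdbc_url = (
    f"jdbc:databricks://{server_hostname}:443;"
    f"httpPath={http_path};"
    "AuthMech=3;UID=token;"
    f"PWD={access_token}"
)
print(jdbc_url)
```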