Supported connection properties
This article describes the connection properties supported by the Databricks JDBC Driver (OSS).
Authentication and proxy properties
The following connection properties are supported by the Databricks JDBC Driver (OSS). Properties are case insensitive.
Property | Default value | Description |
---|---|---|
`AsyncExecPollInterval` | `200` | The time in milliseconds between each poll for the asynchronous query execution status. Asynchronous refers to the fact that the RPC call used to execute a query against Spark is asynchronous; it does not mean that JDBC asynchronous operations are supported. |
`Auth_Flow` | `0` | The OAuth2 authentication flow for the driver connection. This property is required if `AuthMech` is `11`. |
`Auth_JWT_Key_File` | | The path to the private key file (PEM format) for JWT authentication. |
`Auth_JWT_Alg` | `RS256` | The algorithm for private key JWT authentication. The supported algorithms are RSA: `RS256`, `RS384`, `RS512`, `PS256`, `PS384`, `PS512`; and EC: `ES256`, `ES384`, `ES512`. |
`Auth_JWT_Key_Passphrase` | | The passphrase for decrypting an encrypted private key. |
`Auth_KID` | | The Key Identifier (KID) required for JWT authentication. This is mandatory when using private key JWT. |
`AuthMech` | Required | The authentication mechanism, where `3` specifies a Databricks personal access token and `11` specifies OAuth 2.0 tokens. |
`CFProxyAuth` | `0` | If set to `1`, the driver uses the proxy authentication username and password, represented by `CFProxyUID` and `CFProxyPwd`. |
`CFProxyHost` | | A string that represents the name of the proxy host to use when `UseCFProxy` is also set to `1`. |
`CFProxyPort` | | An integer that represents the number of the proxy port to use when `UseCFProxy` is also set to `1`. |
`CFProxyUID` | | A string that represents the username to use for proxy authentication when `CFProxyAuth` and `UseCFProxy` are also set to `1`. |
`CFProxyPwd` | | A string that represents the password to use for proxy authentication when `CFProxyAuth` and `UseCFProxy` are also set to `1`. |
`ConnCatalog` | `SPARK` | The name of the default catalog to use. |
`ConnSchema` | `default` | The name of the default schema to use. This can be specified either by replacing `<schema>` in the URL with the name of the schema to use, or by setting the `ConnSchema` property to the name of the schema to use. |
`EnableComplexDatatypeSupport` | `0` | If set to `1`, the driver returns complex data types such as ARRAY, MAP, and STRUCT as native objects rather than string representations. |
`GoogleServiceAccount` | | Enables authentication using a Google service account. |
`GoogleCredentialsFile` | | The path to the JSON key file for Google service account authentication. |
`EnableTokenCache` | `1` | If set to `0`, disables local caching of OAuth tokens. |
`OIDCDiscoveryEndpoint` | | The OpenID Connect discovery URL for retrieving the OIDC configuration. |
`OAuthRefreshToken` | | The OAuth2 refresh token used to retrieve a new access token. |
`OAuth2AuthorizationEndPoint` | | The authorization endpoint URL used in an OAuth2 flow. |
`OAuth2TokenEndpoint` | | The token endpoint URL for the OAuth2 flow. |
`ProxyAuth` | `0` | If set to `1`, the driver uses the proxy authentication username and password, represented by `ProxyUID` and `ProxyPwd`. |
`ProxyHost` | | A string that represents the name of the proxy host to use when `UseProxy` is also set to `1`. |
`ProxyPort` | | An integer that represents the number of the proxy port to use when `UseProxy` is also set to `1`. |
`ProxyPwd` | | A string that represents the password to use for proxy authentication when `ProxyAuth` and `UseProxy` are also set to `1`. |
`ProxyUID` | | A string that represents the username to use for proxy authentication when `ProxyAuth` and `UseProxy` are also set to `1`. |
`SSL` | `1` | Whether the connector communicates with the Spark server through an SSL-enabled socket. |
`UseProxy` | `0` | If set to `1`, the driver uses the provided proxy settings (for example, `ProxyAuth`, `ProxyHost`, `ProxyPort`, `ProxyPwd`, and `ProxyUID`). |
`UseSystemProxy` | `0` | If set to `1`, the driver uses the proxy settings set at the system level. |
`UseCFProxy` | `0` | If set to `1`, the driver uses the cloud fetch proxy settings if they are provided; otherwise, it uses the regular proxy. |
`UseJWTAssertion` | `0` | Enables private key JWT authentication for M2M use cases where client secret authentication is restricted. |
`UserAgentEntry` | | The User-Agent entry to be included in the HTTP request. This value is in the following format: `[ProductName]/[ProductVersion] [Comment]`. |
`UseThriftClient` | `0` | Whether the JDBC driver should use the Thrift client or the Statement Execution APIs. |
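To see how a few of these properties fit together, here is a minimal connection sketch. It is illustrative rather than canonical: the workspace host, HTTP path, and token are placeholders, and the `UID`/`PWD` convention (username `token` plus a personal access token) is assumed from common Databricks JDBC usage with `AuthMech=3`.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.util.Properties;

public class ConnectExample {
    public static void main(String[] args) throws Exception {
        // Placeholder host and HTTP path; AuthMech=3 selects personal access token auth.
        String url = "jdbc:databricks://<workspace-host>:443/default;"
                + "ssl=1;AuthMech=3;httpPath=/sql/1.0/warehouses/<warehouse-id>";

        Properties props = new Properties();
        props.setProperty("UID", "token");                    // literal username used with PATs (assumed convention)
        props.setProperty("PWD", "<personal-access-token>");  // placeholder secret

        try (Connection conn = DriverManager.getConnection(url, props)) {
            System.out.println("Connected to " + conn.getMetaData().getDatabaseProductName());
        }
    }
}
```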
SQL configuration properties
The following SQL configuration properties are supported by the Databricks JDBC Driver (OSS). These are also described in Configuration parameters. Properties are case insensitive.
Property | Default value | Description |
---|---|---|
`ansi_mode` | `TRUE` | Whether to enable strict ANSI SQL behavior for certain functions and casting rules. |
`enable_photon` | `TRUE` | Whether to enable the Photon vectorized query engine. |
`legacy_time_parser_policy` | `EXCEPTION` | The methods used to parse and format dates and timestamps. Valid values are `EXCEPTION`, `LEGACY`, and `CORRECTED`. |
`max_file_partition_bytes` | `128m` | The maximum number of bytes to pack into a single partition when reading from file-based sources. The setting can be any positive integer and optionally include a measure such as `b` (bytes) or `k` or `kb` (1024 bytes). |
`read_only_external_metastore` | `false` | Controls whether an external metastore is treated as read-only. |
`statement_timeout` | `172800` | Sets a SQL statement timeout between 0 and 172800 seconds. |
`timezone` | `UTC` | Set the local timezone. Region IDs in the form `area/city`, such as `America/Los_Angeles`, or zone offsets in the format `(+\|-)HH`, `(+\|-)HH:mm`, or `(+\|-)HH:mm:ss`, for example `-08`, `+01:00`, or `-13:33:33`. `UTC` is also supported as an alias. |
`use_cached_result` | `true` | Whether Databricks SQL caches and reuses results whenever possible. |
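Like the connection properties above, these SQL parameters can be appended directly to the connection URL. For example (the host matches the logging example later in this article, and the values are illustrative):

`jdbc:databricks://localhost:11000;statement_timeout=3600;timezone=America/Los_Angeles;use_cached_result=false`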
Logging properties
The following logging properties are supported by the Databricks JDBC Driver (OSS). Properties are case insensitive.
Property | Default value | Description |
---|---|---|
`LogLevel` | `0` | The logging level, which is a value 0 through 6: 0 disables all logging; 1 logs severe errors that cause the connector to abort (FATAL); 2 adds error events (ERROR); 3 adds potentially harmful situations (WARNING); 4 adds informational messages about connector progress (INFO); 5 adds fine-grained events useful for debugging (DEBUG); and 6 logs all connector activity (TRACE). Use this property to enable or disable logging in the connector and to specify the amount of detail included in log files. |
`LogPath` | To determine the default path for logs, the driver uses the value set for these system properties, in this priority order: 1. `user.dir`, 2. `java.io.tmpdir`, 3. the current directory (`.`). | The full path to the folder where the connector saves log files when logging is enabled, as a string. To ensure that the connection URL is compatible with all JDBC applications, escape the backslashes (`\`) in your file path by typing another backslash. If the `LogPath` value is invalid, the connector sends the logged information to the standard output stream (`System.out`). |
`LogFileSize` | No maximum | The maximum allowed log file size, specified in MB. |
`LogFileCount` | No maximum | The maximum number of allowed log files. |
Enable and configure logging
The JDBC driver supports the Simple Logging Facade for Java (SLF4J) and java.util.logging (JUL) frameworks. The driver uses the JUL logging framework by default.
To enable and configure logging for the JDBC driver:

1. Enable the logging framework that you want to use:

   - For SLF4J logging, set the system property `-Dcom.databricks.jdbc.loggerImpl=SLF4JLOGGER` and provide the SLF4J binding implementation (compatible with SLF4J version 2.0.13 and above) and the corresponding configuration file in the classpath.
   - For JUL logging, set the system property `-Dcom.databricks.jdbc.loggerImpl=JDKLOGGER`. This is the default.

2. Set the `LogLevel` property on the connection string to the desired level of information to include in log files.

3. Set the `LogPath` property on the connection string to the full path to the folder where you want to save log files.

   For example, the following connection URL enables logging level 6 and saves the log files to the C:\temp folder:

   `jdbc:databricks://localhost:11000;LogLevel=6;LogPath=C:\\temp`

4. Restart your JDBC application and reconnect to the server to apply the settings.
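The same configuration can also be applied programmatically. The following minimal sketch sets the logger implementation via `System.setProperty` (equivalent to passing the `-D` flag above on the command line) and opens a connection with logging enabled; note the doubled backslashes required by the `LogPath` escaping rule:

```java
import java.sql.Connection;
import java.sql.DriverManager;

public class LoggingExample {
    public static void main(String[] args) throws Exception {
        // Same effect as -Dcom.databricks.jdbc.loggerImpl=JDKLOGGER (the default framework).
        System.setProperty("com.databricks.jdbc.loggerImpl", "JDKLOGGER");

        // LogLevel=6 is the most verbose level; the Windows path is backslash-escaped,
        // so the URL carries LogPath=C:\\temp as in the example above.
        String url = "jdbc:databricks://localhost:11000;LogLevel=6;LogPath=C:\\\\temp";

        try (Connection conn = DriverManager.getConnection(url)) {
            // Driver activity is now written to log files under C:\temp.
        }
    }
}
```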
Volume operations properties
The following Unity Catalog volume operations properties are supported by the Databricks JDBC Driver (OSS). Properties are case insensitive.
Property | Default value | Description |
---|---|---|
`VolumeOperationAllowedLocalPaths` | | A comma-separated list of local paths that are allowed for downloading and uploading Unity Catalog volume ingestion files. The allowed paths include their subdirectories. |
Manage files using volumes
Databricks offers bulk ingestion capabilities (upload, download, and remove) using Unity Catalog volumes, allowing users to transfer datasets such as CSV files between local storage and volumes. See What are Unity Catalog volumes?. To enable Unity Catalog volume operations, set the connection property `VolumeOperationAllowedLocalPaths` to a comma-separated list of the local paths that are allowed for volume operations.
Unity Catalog must be enabled to use this feature. Similar functionality is available using the Databricks UI. See Upload files to a Unity Catalog volume.
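For example, a connection URL that allows volume ingestion to and from two local staging directories (the paths are illustrative) might look like this:

`jdbc:databricks://localhost:11000;VolumeOperationAllowedLocalPaths=/tmp/staging,/data/exports`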
The Unity Catalog ingestion commands are SQL statements. The examples below demonstrate PUT, GET, and REMOVE operations.
Upload a local file
To upload the local file `/tmp/test.csv` to the Unity Catalog volume path `/Volumes/main/default/e2etests/file1.csv`, use the PUT operation:

`PUT '/tmp/test.csv' INTO '/Volumes/main/default/e2etests/file1.csv' OVERWRITE`
Download a file
To download a file from the Unity Catalog volume path `/Volumes/main/default/e2etests/file1.csv` to the local file `/tmp/test.csv`, use the GET operation:

`GET '/Volumes/main/default/e2etests/file1.csv' TO '/tmp/test.csv'`
Delete a file
To delete a file at the Unity Catalog volume path `/Volumes/main/default/e2etests/file1.csv`, use the REMOVE operation:

`REMOVE '/Volumes/main/default/e2etests/file1.csv'`
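Putting the three operations together, the following minimal JDBC sketch issues the statements above through a regular `Statement`. It assumes `/tmp` is listed in `VolumeOperationAllowedLocalPaths` and omits authentication properties for brevity:

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Statement;

public class VolumeOperationsExample {
    public static void main(String[] args) throws Exception {
        // Allow volume ingestion to and from /tmp; add authentication properties as needed.
        String url = "jdbc:databricks://localhost:11000;VolumeOperationAllowedLocalPaths=/tmp";

        try (Connection conn = DriverManager.getConnection(url);
             Statement stmt = conn.createStatement()) {
            // Upload, download, and finally delete the same volume file,
            // using the paths from the examples above.
            stmt.execute("PUT '/tmp/test.csv' INTO '/Volumes/main/default/e2etests/file1.csv' OVERWRITE");
            stmt.execute("GET '/Volumes/main/default/e2etests/file1.csv' TO '/tmp/test.csv'");
            stmt.execute("REMOVE '/Volumes/main/default/e2etests/file1.csv'");
        }
    }
}
```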