
Supported connection properties

This article describes the connection properties supported by the Databricks JDBC Driver (OSS).

Authentication and proxy properties

The following connection properties are supported by the Databricks JDBC Driver (OSS). Properties are case insensitive.

AsyncExecPollInterval (default: 200)
The time in milliseconds between each poll for the asynchronous query execution status. Asynchronous refers to the fact that the RPC call used to execute a query against Spark is asynchronous; it does not mean that JDBC asynchronous operations are supported.

Auth_Flow (default: 0)
The OAuth2 authentication flow for the driver connection. This property is required if AuthMech is 11.

Auth_JWT_Key_File (default: null)
The path to the private key file (PEM format) for JWT authentication.

Auth_JWT_Alg (default: RS256)
The algorithm for private key JWT authentication. The supported algorithms are RS256, RS384, RS512, PS256, PS384, and PS512 (RSA), and ES256, ES384, and ES512 (EC).

Auth_JWT_Key_Passphrase (default: null)
The passphrase for decrypting an encrypted private key.

Auth_KID (default: null)
The Key Identifier (KID) required for JWT authentication. This is mandatory when using private key JWT.

AuthMech (required)
The authentication mechanism: 3 specifies a Databricks personal access token, and 11 specifies OAuth 2.0 tokens. Additional properties are required for each mechanism. See Authenticate the driver.

CFProxyAuth (default: 0)
If set to 1, the driver uses the proxy authentication user and password, represented by CFProxyUID and CFProxyPwd.

CFProxyHost (default: null)
A string that represents the name of the proxy host to use when UseCFProxy is also set to 1.

CFProxyPort (default: null)
An integer that represents the proxy port number to use when UseCFProxy is also set to 1.

CFProxyUID (default: null)
A string that represents the username to use for proxy authentication when CFProxyAuth and UseCFProxy are also set to 1.

CFProxyPwd (default: null)
A string that represents the password to use for proxy authentication when CFProxyAuth and UseCFProxy are also set to 1.

ConnCatalog or catalog (default: SPARK)
The name of the default catalog to use.

ConnSchema or schema (default: default)
The name of the default schema to use. This can be specified either by replacing <schema> in the URL with the name of the schema to use or by setting the ConnSchema property to the name of the schema to use.

EnableComplexDatatypeSupport (default: 0)
If set to 1, complex data types (ARRAY, STRUCT, MAP) are returned as native Java objects instead of strings.

GoogleServiceAccount (default: null)
Enables authentication using a Google service account.

GoogleCredentialsFile (default: null)
The path to the JSON key file for Google service account authentication.

EnableOIDCDiscovery (default: 1)
If set to 1, the OpenID Connect discovery URL is used.

OIDCDiscoveryEndpoint (default: null)
The OpenID Connect discovery URL for retrieving the OIDC configuration.

Auth_RefreshToken (default: null)
The OAuth2 refresh token used to retrieve a new access token.

OAuth2ConnAuthAuthorizeEndpoint (default: null)
The authorization endpoint URL used in an OAuth2 flow.

OAuth2ConnAuthTokenEndpoint (default: null)
The token endpoint URL for the OAuth2 flow.

ProxyAuth (default: 0)
If set to 1, the driver uses the proxy authentication user and password, represented by ProxyUID and ProxyPwd.

ProxyHost (default: null)
A string that represents the name of the proxy host to use when UseProxy is also set to 1.

ProxyPort (default: null)
An integer that represents the proxy port number to use when UseProxy is also set to 1.

ProxyPwd (default: null)
A string that represents the password to use for proxy authentication when ProxyAuth and UseProxy are also set to 1.

ProxyUID (default: null)
A string that represents the username to use for proxy authentication when ProxyAuth and UseProxy are also set to 1.

SSL (default: 1)
Whether the connector communicates with the Spark server through an SSL-enabled socket.

UseProxy (default: 0)
If set to 1, the driver uses the provided proxy settings (for example, ProxyAuth, ProxyHost, ProxyPort, ProxyPwd, and ProxyUID).

UseSystemProxy (default: 0)
If set to 1, the driver uses the proxy settings that have been set at the system level. Any additional proxy properties set in the connection URL override those set at the system level.

UseCFProxy (default: 0)
If set to 1, the driver uses the cloud fetch proxy settings if they are provided; otherwise, it uses the regular proxy.

UseJWTAssertion (default: false)
Enables private key JWT authentication for M2M use cases where client secret authentication is restricted.

UserAgentEntry (default: browser)
The User-Agent entry to be included in the HTTP request. This value is in the following format: [ProductName]/[ProductVersion] [Comment].

UseThriftClient (default: 1)
Whether the JDBC driver should use the Thrift client or the Statement Execution API.
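
As a quick illustration of how these properties are supplied, the sketch below opens a connection with a few of them set, some inline on the JDBC URL and some through a java.util.Properties object. It assumes personal access token authentication (AuthMech=3); the workspace host, HTTP path, and token are placeholders, and the UID=token/PWD convention is an assumption carried over from common Databricks JDBC usage, so see Authenticate the driver for the exact properties your setup requires.

Java
  import java.sql.Connection;
  import java.sql.DriverManager;
  import java.util.Properties;

  public class ConnectExample {
      public static void main(String[] args) throws Exception {
          // Placeholder host and HTTP path; replace with your workspace values.
          String url = "jdbc:databricks://<workspace-host>:443/default;"
                  + "transportMode=http;ssl=1;AuthMech=3;httpPath=<http-path>";

          Properties props = new Properties();
          props.put("UID", "token");                   // assumed PAT convention
          props.put("PWD", "<personal-access-token>"); // placeholder token
          props.put("ConnCatalog", "main");            // default catalog for the session
          props.put("UserAgentEntry", "MyApp/1.0");    // [ProductName]/[ProductVersion]

          try (Connection conn = DriverManager.getConnection(url, props)) {
              System.out.println("Connected: " + !conn.isClosed());
          }
      }
  }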

SQL configuration properties

The following SQL configuration properties are supported by the Databricks JDBC Driver (OSS). These are also described in Configuration parameters. Properties are case insensitive.

ansi_mode (default: TRUE)
Whether to enable strict ANSI SQL behavior for certain functions and casting rules.

enable_photon (default: TRUE)
Whether to enable the Photon vectorized query engine.

legacy_time_parser_policy (default: EXCEPTION)
The methods used to parse and format dates and timestamps. Valid values are EXCEPTION, LEGACY, and CORRECTED.

max_file_partition_bytes (default: 128m)
The maximum number of bytes to pack into a single partition when reading from file-based sources. The setting can be any positive integer and can optionally include a unit suffix such as b (bytes), or k or kb (1024 bytes).

read_only_external_metastore (default: false)
Controls whether an external metastore is treated as read-only.

statement_timeout (default: 172800)
Sets a SQL statement timeout between 0 and 172800 seconds.

timezone (default: UTC)
Sets the local timezone. Valid values are region IDs in the form area/city, such as America/Los_Angeles, or zone offsets in the format (+|-)HH, (+|-)HH:mm, or (+|-)HH:mm:ss, for example -08, +01:00, or -13:33:33. UTC is also supported as an alias for +00:00.

use_cached_result (default: true)
Whether Databricks SQL caches and reuses results whenever possible.
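
These parameters ride along on the JDBC URL (or in the Properties object) like any other connection property. Below is a minimal sketch under the same placeholder conventions as the connection example above; it sets a few session parameters and reads one back with the Databricks SQL function current_timezone().

Java
  import java.sql.Connection;
  import java.sql.DriverManager;
  import java.sql.ResultSet;
  import java.sql.Statement;

  public class SqlConfigExample {
      public static void main(String[] args) throws Exception {
          // SQL configuration properties are appended after the driver properties;
          // host, HTTP path, and token are placeholders.
          String url = "jdbc:databricks://<workspace-host>:443/default;transportMode=http;"
                  + "ssl=1;AuthMech=3;UID=token;PWD=<personal-access-token>;"
                  + "httpPath=<http-path>;"
                  + "ansi_mode=true;statement_timeout=600;timezone=UTC";

          try (Connection conn = DriverManager.getConnection(url);
               Statement stmt = conn.createStatement();
               // current_timezone() reflects the timezone property set above.
               ResultSet rs = stmt.executeQuery("SELECT current_timezone()")) {
              while (rs.next()) {
                  System.out.println("Session timezone: " + rs.getString(1));
              }
          }
      }
  }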

Logging properties

The following logging properties are supported by the Databricks JDBC Driver (OSS). Properties are case insensitive.

LogLevel (default: OFF)
The logging level, which is a value from 0 through 6:

  • 0: Disable all logging.
  • 1: Enable logging on the FATAL level, which logs very severe error events that will lead the connector to abort.
  • 2: Enable logging on the ERROR level, which logs error events that might still allow the connector to continue running.
  • 3: Enable logging on the WARNING level, which logs events that might result in an error if action is not taken.
  • 4: Enable logging on the INFO level, which logs general information that describes the progress of the connector.
  • 5: Enable logging on the DEBUG level, which logs detailed information that is useful for debugging the connector.
  • 6: Enable logging on the TRACE level, which logs all connector activity.

Use this property to enable or disable logging in the connector and to specify the amount of detail included in log files.

LogPath (default: determined from system properties, in this priority order: user.dir, then java.io.tmpdir, then the current directory, that is, .)
The full path to the folder where the connector saves log files when logging is enabled, as a string. To ensure that the connection URL is compatible with all JDBC applications, escape the backslashes (\) in your file path by typing another backslash. If the LogPath value is invalid, the connector sends the logged information to the standard output stream (System.out).

LogFileSize (default: no maximum)
The maximum allowed log file size, specified in MB.

LogFileCount (default: no maximum)
The maximum number of allowed log files.

Enable and configure logging

The JDBC driver supports the Simple Logging Facade for Java (SLF4J) and java.util.logging (JUL) frameworks. The driver uses the JUL logging framework by default.

To enable and configure logging for the JDBC driver:

  1. Enable the logging framework that you want to use:

    • For SLF4J logging, set the system property -Dcom.databricks.jdbc.loggerImpl=SLF4JLOGGER and provide the SLF4J binding implementation (compatible with SLF4J version 2.0.13 and above) and corresponding configuration file in the classpath.
    • For JUL logging, set the system property -Dcom.databricks.jdbc.loggerImpl=JDKLOGGER. This is the default.
  2. Set the LogLevel property on the connection string to the desired level of information to include in log files.

  3. Set the LogPath property on the connection string to the full path to the folder where you want to save log files.

    For example, the following connection URL enables logging level 6 and saves the log files to the C:\temp folder:

    jdbc:databricks://localhost:11000;LogLevel=6;LogPath=C:\\temp
  4. Restart your JDBC application and reconnect to the server to apply the settings.
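
Putting steps 1 through 3 together, the sketch below selects the JUL logger programmatically and enables DEBUG-level logging (LogLevel=5) to a local folder. Setting the system property in code is assumed to take effect only if it runs before the driver class loads; otherwise, pass it on the command line as shown in step 1. Connection details are placeholders.

Java
  import java.sql.Connection;
  import java.sql.DriverManager;

  public class LoggingExample {
      public static void main(String[] args) throws Exception {
          // Equivalent to -Dcom.databricks.jdbc.loggerImpl=JDKLOGGER (the default);
          // use SLF4JLOGGER here to route through an SLF4J binding instead.
          System.setProperty("com.databricks.jdbc.loggerImpl", "JDKLOGGER");

          // LogLevel=5 (DEBUG) with log files written to /tmp/jdbc-logs.
          String url = "jdbc:databricks://<workspace-host>:443/default;transportMode=http;"
                  + "ssl=1;AuthMech=3;UID=token;PWD=<personal-access-token>;"
                  + "httpPath=<http-path>;LogLevel=5;LogPath=/tmp/jdbc-logs";

          try (Connection conn = DriverManager.getConnection(url)) {
              System.out.println("Connected with driver logging enabled.");
          }
      }
  }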

Volume operations properties

The following Unity Catalog volume operations properties are supported by the Databricks JDBC Driver (OSS). Properties are case insensitive.

VolumeOperationAllowedLocalPaths (default: empty)
A comma-separated list of local paths that are allowed for downloading and uploading Unity Catalog volume ingestion files. The paths include their subdirectories.

Manage files using volumes

Databricks offers bulk ingestion capabilities (upload, download, and remove) using Unity Catalog volumes, allowing users to transfer datasets, such as CSV files, to and from local storage. See What are Unity Catalog volumes?. To enable Unity Catalog volume operations, set the connection property VolumeOperationAllowedLocalPaths to a comma-separated list of local paths that are allowed for volume operations.

Unity Catalog must be enabled to use this feature. Similar functionality is available using the Databricks UI. See Upload files to a Unity Catalog volume.

The Unity Catalog ingestion commands are SQL statements. The examples below demonstrate PUT, GET, and REMOVE operations.

Upload a local file

To upload a local file /tmp/test.csv to the Unity Catalog volume path /Volumes/main/default/e2etests/file1.csv, use the PUT operation:

Text
  PUT '/tmp/test.csv' INTO '/Volumes/main/default/e2etests/file1.csv' OVERWRITE

Download a file

To download a file from the Unity Catalog volume path /Volumes/main/default/e2etests/file1.csv into a local file /tmp/test.csv, use the GET operation:

Text
  GET '/Volumes/main/default/e2etests/file1.csv' TO '/tmp/test.csv'

Delete a file

To delete a file at the Unity Catalog volume path /Volumes/main/default/e2etests/file1.csv, use the REMOVE operation:

Text
  REMOVE '/Volumes/main/default/e2etests/file1.csv'
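
Because these ingestion commands are ordinary SQL statements, they can be issued through a standard JDBC Statement once VolumeOperationAllowedLocalPaths covers the local paths involved. Below is a minimal sketch tying the three operations together, with placeholder connection details as in the earlier examples.

Java
  import java.sql.Connection;
  import java.sql.DriverManager;
  import java.sql.Statement;

  public class VolumeIngestionExample {
      public static void main(String[] args) throws Exception {
          // Allow volume operations under /tmp; host, HTTP path, and token
          // are placeholders.
          String url = "jdbc:databricks://<workspace-host>:443/default;transportMode=http;"
                  + "ssl=1;AuthMech=3;UID=token;PWD=<personal-access-token>;"
                  + "httpPath=<http-path>;VolumeOperationAllowedLocalPaths=/tmp";

          try (Connection conn = DriverManager.getConnection(url);
               Statement stmt = conn.createStatement()) {
              // Upload a local file, download it back, then remove it from the volume.
              stmt.execute("PUT '/tmp/test.csv' INTO '/Volumes/main/default/e2etests/file1.csv' OVERWRITE");
              stmt.execute("GET '/Volumes/main/default/e2etests/file1.csv' TO '/tmp/test-copy.csv'");
              stmt.execute("REMOVE '/Volumes/main/default/e2etests/file1.csv'");
          }
      }
  }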