Read Databricks tables from Delta clients
This page provides an overview of using the Unity REST API to access Unity Catalog managed and external tables from external Delta clients. To create external Delta tables from external clients, see Create external Delta tables from external clients.
Use the Unity REST API to read Unity Catalog-registered tables on Databricks from supported Delta clients, including Apache Spark and DuckDB.
For a full list of supported integrations, see Unity Catalog integrations.
Read and write using the Unity REST API
The Unity REST API provides external clients read access to tables registered to Unity Catalog. Some clients also support creating tables and writing to existing tables.
To avoid potential data corruption and data loss issues, Databricks recommends you do not modify the same Delta table stored in S3 from different writer clients.
Configure access using the endpoint /api/2.1/unity-catalog.
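For example, you can verify connectivity to this endpoint from outside Databricks before configuring a full client. The following is a minimal sketch that lists the tables in a schema using Python and the requests library; the my_catalog and my_schema names and the DATABRICKS_TOKEN environment variable are illustrative assumptions, and the token must belong to a principal with the required privileges:
import os
import requests

# Hypothetical placeholder values; substitute your workspace URL, catalog, and schema.
workspace_url = "https://<workspace-url>"
catalog_name = "my_catalog"
schema_name = "my_schema"

# List tables in a Unity Catalog schema through the Unity REST API.
response = requests.get(
    f"{workspace_url}/api/2.1/unity-catalog/tables",
    headers={"Authorization": f"Bearer {os.environ['DATABRICKS_TOKEN']}"},
    params={"catalog_name": catalog_name, "schema_name": schema_name},
)
response.raise_for_status()

for table in response.json().get("tables", []):
    print(table["full_name"])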
Requirements
Databricks supports Unity REST API access to tables as part of Unity Catalog. You must have Unity Catalog enabled in your workspace to use these endpoints. The following table types are eligible for Unity REST API reads:
- Unity Catalog managed tables.
- Unity Catalog external tables.
Complete the following configuration steps to enable read access to Databricks objects from Delta clients using the Unity REST API:
- Enable External data access for your metastore. See Enable external data access on the metastore.
- Grant the principal configuring the integration the EXTERNAL USE SCHEMA privilege on the schema containing the objects. See Grant a principal Unity Catalog privileges.
- Authenticate using one of the following methods:
- Personal access token (PAT): See Authorize access to Databricks resources.
- OAuth machine-to-machine (M2M) authentication: Supports automatic credential and token refresh for long-running Spark jobs (>1 hour). See Authorize service principal access to Databricks with OAuth.
Read Delta tables with Apache Spark using PAT authentication
The following configuration is required to read Unity Catalog managed and external Delta tables with Apache Spark using PAT authentication:
"spark.sql.extensions": "io.delta.sql.DeltaSparkSessionExtension",
"spark.sql.catalog.spark_catalog": "io.unitycatalog.spark.UCSingleCatalog",
"spark.sql.catalog.<uc-catalog-name>": "io.unitycatalog.spark.UCSingleCatalog",
"spark.sql.catalog.<uc-catalog-name>.uri": "<workspace-url>/api/2.1/unity-catalog",
"spark.sql.catalog.<uc-catalog-name>.token": "<token>",
"spark.sql.defaultCatalog": "<uc-catalog-name>",
"spark.hadoop.fs.s3.impl": "org.apache.hadoop.fs.s3a.S3AFileSystem",
"spark.jars.packages": "io.delta:delta-spark_2.13:4.0.1,io.unitycatalog:unitycatalog-spark_2.13:0.3.1,org.apache.hadoop:hadoop-aws:3.4.0"
Substitute the following variables:
- <uc-catalog-name>: The name of the catalog in Unity Catalog that contains your tables.
- <workspace-url>: URL of the Databricks workspace.
- <token>: Personal access token (PAT) for the principal configuring the integration.
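As an illustration, the same settings can be supplied programmatically when building a Spark session outside Databricks. The following minimal PySpark sketch assumes a placeholder catalog name my_catalog, reads the PAT from a DATABRICKS_TOKEN environment variable, and queries a hypothetical table my_schema.my_table:
import os
from pyspark.sql import SparkSession

# Hypothetical placeholder values; substitute your catalog, workspace URL, and table.
catalog = "my_catalog"
workspace_url = "https://<workspace-url>"

spark = (
    SparkSession.builder.appName("uc-delta-read")
    .config("spark.sql.extensions", "io.delta.sql.DeltaSparkSessionExtension")
    .config("spark.sql.catalog.spark_catalog", "io.unitycatalog.spark.UCSingleCatalog")
    .config(f"spark.sql.catalog.{catalog}", "io.unitycatalog.spark.UCSingleCatalog")
    .config(f"spark.sql.catalog.{catalog}.uri", f"{workspace_url}/api/2.1/unity-catalog")
    .config(f"spark.sql.catalog.{catalog}.token", os.environ["DATABRICKS_TOKEN"])
    .config("spark.sql.defaultCatalog", catalog)
    .config("spark.hadoop.fs.s3.impl", "org.apache.hadoop.fs.s3a.S3AFileSystem")
    .config(
        "spark.jars.packages",
        "io.delta:delta-spark_2.13:4.0.1,"
        "io.unitycatalog:unitycatalog-spark_2.13:0.3.1,"
        "org.apache.hadoop:hadoop-aws:3.4.0",
    )
    .getOrCreate()
)

# Because defaultCatalog points at the Unity Catalog catalog, a two-part name resolves there.
spark.sql("SELECT * FROM my_schema.my_table LIMIT 10").show()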
To enable automatic credential renewal for long-running jobs, add the following configuration:
"spark.sql.catalog.<catalog-name>.renewCredential.enabled": true
The package versions shown above are current as of the last update to this page. Newer versions may be available. Verify that package versions are compatible with your Databricks Runtime version and Spark version.
For additional details about configuring Apache Spark for cloud object storage, see the Unity Catalog OSS documentation.
Read Delta tables with Apache Spark using OAuth authentication
Databricks also supports OAuth machine-to-machine (M2M) authentication. OAuth automatically handles token renewal for Unity Catalog authentication. For long-running jobs that also require automatic cloud storage credential renewal, enable the spark.sql.catalog.<uc-catalog-name>.renewCredential.enabled setting in your Spark configuration.
OAuth authentication for external Spark clients requires:
- Unity Catalog Spark client version 0.3.1 or later (io.unitycatalog:unitycatalog-spark)
- Apache Spark 4.0 or later
- Delta Spark 4.0.1 or later with OAuth support
- An OAuth M2M service principal with appropriate permissions. See Authorize service principal access to Databricks with OAuth.
The following configuration is required to read Unity Catalog managed tables and external Delta tables with Apache Spark using OAuth authentication:
"spark.sql.extensions": "io.delta.sql.DeltaSparkSessionExtension",
"spark.sql.catalog.spark_catalog": "io.unitycatalog.spark.UCSingleCatalog",
"spark.sql.catalog.<uc-catalog-name>": "io.unitycatalog.spark.UCSingleCatalog",
"spark.sql.catalog.<uc-catalog-name>.uri": "<workspace-url>/api/2.1/unity-catalog",
"spark.sql.catalog.<uc-catalog-name>.auth.type": "oauth",
"spark.sql.catalog.<uc-catalog-name>.auth.oauth.uri": "<oauth-token-endpoint>",
"spark.sql.catalog.<uc-catalog-name>.auth.oauth.clientId": "<oauth-client-id>",
"spark.sql.catalog.<uc-catalog-name>.auth.oauth.clientSecret": "<oauth-client-secret>",
"spark.sql.catalog.<uc-catalog-name>.renewCredential.enabled": "true",
"spark.sql.defaultCatalog": "<uc-catalog-name>",
"spark.hadoop.fs.s3.impl": "org.apache.hadoop.fs.s3a.S3AFileSystem",
"spark.jars.packages": "io.delta:delta-spark_2.13:4.0.1,io.unitycatalog:unitycatalog-spark_2.13:0.3.1,org.apache.hadoop:hadoop-aws:3.4.0"
Substitute the following variables:
- <uc-catalog-name>: The name of the catalog in Unity Catalog that contains your tables.
- <workspace-url>: URL of the Databricks workspace. See Workspace instance names, URLs, and IDs.
- <oauth-token-endpoint>: OAuth token endpoint URL. To construct this URL:
  - Locate your Databricks account ID. See Locate your account ID.
  - Use the format: https://accounts.cloud.databricks.com/oidc/accounts/<account-id>/v1/token
- <oauth-client-id>: OAuth client ID for your service principal. See Authorize service principal access to Databricks with OAuth.
- <oauth-client-secret>: OAuth client secret for your service principal. See Authorize service principal access to Databricks with OAuth.
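Before wiring these values into Spark, it can help to confirm that the constructed token endpoint and service principal credentials work. The following minimal sketch requests an OAuth token directly using the client credentials flow in Python; the all-apis scope and the environment variable names are assumptions for illustration:
import os
import requests

# Hypothetical placeholder; build the endpoint from your own account ID.
token_endpoint = "https://accounts.cloud.databricks.com/oidc/accounts/<account-id>/v1/token"

# OAuth M2M client credentials flow for a Databricks service principal.
response = requests.post(
    token_endpoint,
    auth=(os.environ["OAUTH_CLIENT_ID"], os.environ["OAUTH_CLIENT_SECRET"]),
    data={"grant_type": "client_credentials", "scope": "all-apis"},
)
response.raise_for_status()

# A successful response contains a short-lived access token; Spark performs this
# exchange automatically when auth.type is set to oauth.
print(response.json()["access_token"][:20], "...")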
The package versions shown above are current as of the last update to this page. Newer versions may be available. Verify that package versions are compatible with your Spark version.