Unity Catalog requirements and limitations
This page describes the compute requirements, supported file formats, naming constraints, and known limitations for Unity Catalog.
Region support
All regions support Unity Catalog. For details, see Databricks clouds and regions.
Compute requirements
Unity Catalog is supported on clusters that run Databricks Runtime 11.3 LTS or above. Unity Catalog is supported by default on all SQL warehouse compute versions.
Clusters running earlier Databricks Runtime versions do not support all Unity Catalog GA features and functionality.
To access data in Unity Catalog, clusters must be configured with the correct access mode. Unity Catalog is secure by default. If a cluster is not configured with standard or dedicated access mode, the cluster can't access data in Unity Catalog. See Access modes.
For detailed information about Unity Catalog functionality changes in each Databricks Runtime version, see the release notes.
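As a quick sanity check, a notebook cell like the following can confirm that a cluster is Unity Catalog-enabled. This is a minimal sketch, assuming a cluster on Databricks Runtime 13.3 LTS or above with standard or dedicated access mode; `current_metastore()` is a built-in SQL function that returns the attached metastore ID and fails on compute that is not configured for Unity Catalog.

```scala
// Minimal sketch: confirm the attached cluster can reach Unity Catalog.
// Assumes DBR 13.3 LTS+ with standard or dedicated access mode.
spark.sql("SELECT current_metastore() AS metastore_id").show(truncate = false)

// Listing catalogs also requires a Unity Catalog-enabled cluster.
spark.sql("SHOW CATALOGS").show(truncate = false)
```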
Limitations
Unity Catalog has the following limitations. Some of these are specific to older Databricks Runtime versions and compute access modes.
Structured Streaming workloads have additional limitations, depending on Databricks Runtime and access mode. See Standard compute requirements and limitations and Dedicated compute requirements and limitations.
Databricks regularly releases new functionality that shrinks this list.
- Groups that were previously created in a workspace (that is, workspace-level groups) cannot be used in Unity Catalog `GRANT` statements. This ensures a consistent view of groups that can span workspaces. To use groups in `GRANT` statements, create your groups at the account level and update any automation for principal or group management (such as SCIM, Okta and Microsoft Entra ID connectors, and Terraform) to reference account endpoints instead of workspace endpoints. See Group sources. A `GRANT` sketch appears after this list.
- Workloads in R do not support the use of dynamic views for row-level or column-level security on compute running Databricks Runtime 15.3 and below. Use a dedicated compute resource running Databricks Runtime 15.4 LTS or above for workloads in R that query dynamic views. Such workloads also require a workspace that is enabled for serverless compute. For details, see Fine-grained access control on dedicated compute.
- A managed table can be shallow cloned to another managed table on Databricks Runtime 13.3 LTS and above. An external table can be shallow cloned to another external table on Databricks Runtime 14.2 and above. A managed table cannot be shallow cloned to an external table, and an external table cannot be shallow cloned to a managed table. For more information, see Shallow clone for Unity Catalog tables. A sketch appears after this list.
- Bucketing is not supported for Unity Catalog tables. Commands that try to create a bucketed table in Unity Catalog throw an exception.
- Writing to the same path or Delta Lake table from workspaces in multiple regions can lead to unreliable performance if some clusters access Unity Catalog and others do not.
- Manipulating partitions for external tables using commands like `ALTER TABLE ADD PARTITION` requires partition metadata logging to be enabled. See Partition discovery for external tables. A sketch follows this list.
- When using overwrite mode for tables not in Delta format, the user must have the `CREATE TABLE` privilege on the parent schema and must be the owner of the existing object or have the `MODIFY` privilege on the object.
- Python UDFs are not supported in Databricks Runtime 12.2 LTS and below. This includes UDAFs, UDTFs, and Pandas on Spark (`applyInPandas` and `mapInPandas`). Python scalar UDFs are supported in Databricks Runtime 13.3 LTS and above.
- Scala UDFs are not supported in Databricks Runtime 14.1 and below on compute with standard access mode. Scala scalar UDFs are supported in Databricks Runtime 14.2 and above on compute with standard access mode; a sketch of one appears after this list.
- Standard Scala thread pools are not supported. Instead, use the special thread pools in `org.apache.spark.util.ThreadUtils`, for example, `org.apache.spark.util.ThreadUtils.newDaemonFixedThreadPool`. However, the following thread pools in `ThreadUtils` are not supported: `ThreadUtils.newForkJoinPool` and any `ScheduledExecutorService` thread pool. A usage sketch appears after this list.
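For the group limitation above, a minimal sketch of the supported pattern, assuming an account-level group exists. The catalog, schema, and group names (`main`, `sales`, `data-analysts`) are hypothetical placeholders, not names from this page.

```scala
// Sketch: Unity Catalog GRANT statements accept account-level groups only.
// `main.sales` and `data-analysts` are hypothetical names.
spark.sql("GRANT SELECT ON SCHEMA main.sales TO `data-analysts`")

// The same statement fails if `data-analysts` exists only as a
// workspace-level group rather than an account-level group.
```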
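The shallow clone rules can be sketched as follows; the table names are hypothetical, and the runtime requirements are the ones stated in the list above.

```scala
// Sketch: shallow clone between Unity Catalog tables of the same kind.
// Managed-to-managed requires DBR 13.3 LTS+; external-to-external, DBR 14.2+.
// `main.bronze.events` and `main.dev.events_clone` are hypothetical names.
spark.sql("""
  CREATE TABLE IF NOT EXISTS main.dev.events_clone
  SHALLOW CLONE main.bronze.events
""")

// Cloning across the managed/external boundary fails in either direction.
```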
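For the partition limitation, a hedged sketch. The Spark setting named in the comment is the cluster-level switch described under Partition discovery for external tables and must be in effect when the table is created; the table and partition names are hypothetical.

```scala
// Sketch: partition DDL on an external, non-Delta table. Requires partition
// metadata logging, enabled at table creation time via the cluster Spark
// config spark.databricks.nonDelta.partitionLog.enabled=true (verify the
// setting name against the partition discovery docs before relying on it).
spark.sql("""
  ALTER TABLE main.raw.clickstream
  ADD PARTITION (event_date = '2024-01-01')
""")
```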
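To make the UDF support matrix concrete, here is a minimal Scala scalar UDF of the kind supported on standard access mode from Databricks Runtime 14.2 and above; the function and column names are invented for illustration.

```scala
import org.apache.spark.sql.functions.udf
import spark.implicits._

// Minimal Scala scalar UDF: masks the local part of an email address.
// Supported on standard access mode in DBR 14.2 and above.
val redactEmail = udf((email: String) =>
  if (email == null) null
  else email.replaceAll("(?<=.).(?=[^@]*@)", "*"))

val df = Seq("alice@example.com", "bob@example.com").toDF("email")
df.select(redactEmail($"email").alias("redacted")).show(false)
```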
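Finally, a sketch of the supported thread pool pattern using `ThreadUtils.newDaemonFixedThreadPool`; the pool size, pool name, and toy workload are placeholders.

```scala
import org.apache.spark.util.ThreadUtils
import scala.concurrent.{Await, ExecutionContext, Future}
import scala.concurrent.duration._

// Sketch: use the supported Spark pool instead of a standard Scala pool.
// newDaemonFixedThreadPool returns a daemon-threaded ThreadPoolExecutor.
val pool = ThreadUtils.newDaemonFixedThreadPool(4, "uc-safe-pool")
implicit val ec: ExecutionContext = ExecutionContext.fromExecutorService(pool)

// Placeholder tasks; a real workload would run Spark actions here.
val work = Future.sequence((1 to 4).map(i => Future(i * i)))
println(Await.result(work, 1.minute)) // Vector(1, 4, 9, 16)

pool.shutdown()
```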
Models registered in Unity Catalog have additional limitations. See Limitations.
Resource quotas
Unity Catalog enforces resource quotas on all securable objects. These quotas are listed in Resource limits. If you expect to exceed these resource limits, contact your Databricks account team.
You can monitor your quota usage using the Unity Catalog resource quotas APIs. See Monitor your usage of Unity Catalog resource quotas.
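As a sketch of programmatic monitoring, the call below lists quota usage over REST. The endpoint path follows the Unity Catalog resource quotas API reference as of this writing but should be verified against the current docs; the host and token are placeholder environment variables.

```scala
import java.net.URI
import java.net.http.{HttpClient, HttpRequest, HttpResponse}

// Sketch: list resource quota usage for the current metastore.
// DATABRICKS_HOST (e.g. https://<workspace>.cloud.databricks.com) and
// DATABRICKS_TOKEN are placeholders; verify the endpoint path against
// the current API reference before relying on it.
val host  = sys.env("DATABRICKS_HOST")
val token = sys.env("DATABRICKS_TOKEN")

val request = HttpRequest.newBuilder()
  .uri(URI.create(s"$host/api/2.1/unity-catalog/resource-quotas/all-resource-quotas"))
  .header("Authorization", s"Bearer $token")
  .GET()
  .build()

val response = HttpClient.newHttpClient()
  .send(request, HttpResponse.BodyHandlers.ofString())
println(response.body()) // JSON payload of quotas and current usage
```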