Unity Catalog public preview limitations

Preview

Unity Catalog is in Public Preview. To participate in the preview, contact your Databricks representative.

During the public preview, Unity Catalog has the following limitations:

  • Python, Scala, and R workloads are supported only on clusters that use the Single User security mode. Workloads in these languages do not support the use of dynamic views for row-level or column-level security.

  • Unity Catalog can be used together with the built-in Hive metastore provided by Databricks. You can’t use Unity Catalog with external Hive metastores that require configuration using init scripts.

  • Unity Catalog does not manage partitions the way Hive does. Unity Catalog managed tables are, by definition, Delta tables, for which partition metadata is tracked in the Delta log. External non-Delta tables may be partitioned in storage, but Unity Catalog does not manage those partitions, so you cannot manage partition objects for these tables through Unity Catalog. See the sketch below.
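
    As a minimal sketch (the catalog, schema, table, and DataFrame names are hypothetical): for a managed Delta table, partitioning is declared when the table is written, and the resulting partition metadata is recorded in the Delta transaction log rather than as partition objects in Unity Catalog.

    ```python
    # Minimal sketch: `main.sales.orders` and the DataFrame `df` are hypothetical.
    # The partition layout declared here is tracked in the table's Delta log,
    # not as Hive-style partition objects managed by Unity Catalog.
    (df.write
        .format("delta")
        .partitionBy("order_date")
        .saveAsTable("main.sales.orders"))
    ```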

  • Overwrite mode for DataFrame write operations into Unity Catalog is supported only for managed Delta tables, not for other table types such as external tables. In addition, the user must have the CREATE privilege on the parent schema and must be the owner of the existing object. See the sketch below.
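
    The following minimal sketch illustrates the supported case (the table and DataFrame names are hypothetical): an overwrite write into a managed Delta table referenced by its three-level name.

    ```python
    # Minimal sketch: `main.sales.orders` and the DataFrame `df` are hypothetical.
    # The overwrite succeeds only for a managed Delta table, and only if the current
    # user has the CREATE privilege on the schema `main.sales` and owns `orders`.
    (df.write
        .format("delta")
        .mode("overwrite")
        .saveAsTable("main.sales.orders"))
    ```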

  • Support for streaming workloads against Unity Catalog is in Private Preview, and there is no support for:

    • Using external locations as the source or destination for streams.

    • Storing streaming metadata in external locations. This metadata includes streaming checkpoints and schema location for the cloud_files source.

    • Using Python or R to write streaming queries. Only Scala and Java are supported in Private Preview.

  • The following Delta Lake features aren’t supported: