Using Auto Loader with Unity Catalog

Auto Loader can securely ingest data from external locations configured with Unity Catalog. To learn more about securely connecting storage with Unity Catalog, see Manage external locations and storage credentials. Auto Loader relies on Structured Streaming for incremental processing; for recommendations and limitations see Using Unity Catalog with Structured Streaming.


In Databricks Runtime 11.3 LTS and above, you can use Auto Loader with either shared or single user access mode. In Databricks Runtime 11.2, only single user access mode is supported.

Ingesting data from external locations managed by Unity Catalog with Auto Loader

You can use Auto Loader to ingest data from any external location managed by Unity Catalog. You must have READ FILES permissions on the external location.
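As a sketch, the READ FILES privilege can be granted with Unity Catalog SQL. The external location name `landing_zone` and the group `data_engineers` below are hypothetical placeholders:

```python
# Hypothetical names: `landing_zone` (external location) and
# `data_engineers` (group). Run the statement via spark.sql(...)
# or from a Databricks SQL editor.
grant_stmt = (
    "GRANT READ FILES ON EXTERNAL LOCATION landing_zone "
    "TO `data_engineers`"
)
```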


Unity Catalog external locations do not support cross-cloud or cross-account configurations for Auto Loader.

Directory listing mode is supported by default. To use file notification mode, you must configure additional cloud credentials that allow Auto Loader to connect to the file notification and queue services; see Compare Auto Loader file detection modes.

Specifying locations for Auto Loader resources for Unity Catalog

The Unity Catalog security model assumes that all storage locations referenced in a workload will be managed by Unity Catalog. Databricks recommends always storing checkpoint and schema evolution information in storage locations managed by Unity Catalog. Unity Catalog does not allow you to nest checkpoint or schema inference and evolution files under the table directory.
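One way to follow this recommendation is to derive checkpoint and schema locations from a dedicated Unity Catalog-managed base path rather than nesting them under the table directory. The helper below is a hypothetical sketch of that convention, not a Databricks API:

```python
def autoloader_paths(base: str, table: str) -> dict:
    """Build checkpoint/schema paths under a UC-managed base path.

    `base` should be a storage location managed by Unity Catalog;
    the _checkpoint/<table> layout is a convention, not a requirement.
    """
    root = f"{base.rstrip('/')}/_checkpoint/{table}"
    # Auto Loader accepts the same path for cloudFiles.schemaLocation
    # and checkpointLocation, so one directory per stream suffices.
    return {"checkpointLocation": root, "cloudFiles.schemaLocation": root}
```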


The following examples assume that the executing user has owner privileges on the target tables, along with the following configurations and grants:

Storage location
Using Auto Loader to load to a Unity Catalog managed table

The snippet below completes the stream: reading with Auto Loader, storing schema and checkpoint state at the same Unity Catalog-managed path, and writing to a managed table. The source path and table name are illustrative placeholders.

checkpoint_path = "s3://dev-bucket/_checkpoint/dev_table"

(spark.readStream
  .format("cloudFiles")
  .option("cloudFiles.format", "json")
  .option("cloudFiles.schemaLocation", checkpoint_path)
  .load("s3://autoloader-source/json-data")  # source path: illustrative
  .writeStream
  .option("checkpointLocation", checkpoint_path)
  .trigger(availableNow=True)
  .toTable("dev_catalog.dev_database.dev_table"))  # target table: illustrative