Ingest data into Delta Lake

Preview

This feature is in Public Preview.

Databricks offers a variety of ways to help you ingest data into Delta Lake.

Partner integrations

Partner data integrations enable you to load data into Databricks from partner product UIs. These integrations provide low-code, easy-to-implement, and scalable data ingestion from a variety of sources into Databricks. For details, see Partner data integrations.

COPY INTO SQL command

The COPY INTO SQL command lets you load data from a file location into a Delta table. COPY INTO is a retryable and idempotent operation: files in the source location that have already been loaded are skipped. For details, see Copy Into (Delta Lake on Databricks).
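As a sketch, a COPY INTO statement might look like the following. The table name, source path, and options shown here are placeholder values for illustration, not resources referenced by this page:

```sql
-- Hypothetical example: load CSV files from cloud storage into a Delta table.
-- Table name and path are placeholders.
COPY INTO my_delta_table
FROM 's3://my-bucket/raw/events/'
FILEFORMAT = CSV
FORMAT_OPTIONS ('header' = 'true', 'inferSchema' = 'true');

-- Re-running the same statement is safe: files already loaded are skipped.
```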

Auto Loader

Auto Loader incrementally and efficiently processes new data files as they arrive in blob storage, without any additional setup. Auto Loader provides a new Structured Streaming source called cloudFiles. Given an input directory path on the cloud file storage, the cloudFiles source automatically sets up file notification services that subscribe to file events from that directory. It then processes new files as they arrive, with the option of also processing existing files in the directory. For details, see Load files from S3 using Auto Loader.
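A minimal Auto Loader stream in Python might look like the following sketch. It assumes a Databricks runtime where the cloudFiles source and a `spark` session are available; the bucket paths are placeholders:

```python
# Hypothetical sketch: incrementally ingest JSON files from S3 into a Delta table.
# All paths below are placeholders, not real resources.
df = (spark.readStream
      .format("cloudFiles")                    # Auto Loader streaming source
      .option("cloudFiles.format", "json")     # format of the incoming files
      .load("s3://my-bucket/landing/events/")) # input directory to watch

(df.writeStream
   .format("delta")
   .option("checkpointLocation", "s3://my-bucket/checkpoints/events/")
   .start("s3://my-bucket/delta/events/"))     # target Delta table path
```

The checkpoint location lets the stream track which files have been processed, so restarting the query resumes where it left off rather than reprocessing everything.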