Delta Lake and Delta Engine guide
Delta Lake is an open source storage layer that brings reliability to data lakes. It provides ACID transactions and scalable metadata handling, and unifies streaming and batch data processing. Delta Lake runs on top of your existing data lake and is fully compatible with Apache Spark APIs. Delta Lake on Databricks lets you configure it for your workload patterns.
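For example, here is a minimal sketch (not part of the guide itself) of writing and reading a Delta table through the standard Spark DataFrame APIs. It assumes a Databricks cluster, or any Spark session with Delta Lake configured; the `/tmp/delta/events` path and column name are purely illustrative.

```python
from pyspark.sql import SparkSession

# Assumes Delta Lake is available to the session (built in on Databricks).
spark = SparkSession.builder.appName("delta-intro").getOrCreate()

# Write a DataFrame as a Delta table; the write is committed as an ACID transaction.
events = spark.range(0, 100).withColumnRenamed("id", "event_id")
events.write.format("delta").mode("overwrite").save("/tmp/delta/events")

# Read the same table back as a batch DataFrame...
spark.read.format("delta").load("/tmp/delta/events").show(5)

# ...or as a streaming source, illustrating unified batch and streaming access.
query = (
    spark.readStream.format("delta")
    .load("/tmp/delta/events")
    .writeStream.format("console")
    .option("checkpointLocation", "/tmp/delta/_checkpoints/events")
    .start()
)
```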
Databricks also includes Delta Engine, which provides optimized layouts and indexes for fast interactive queries.
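As a hedged illustration of those layout optimizations, the following Databricks SQL (issued from the same hypothetical session and table path as above) compacts small files and Z-orders the data to improve data skipping; the column choice is an assumption.

```python
# OPTIMIZE compacts small files; ZORDER BY co-locates related values so queries
# filtering on event_id can skip more files. Databricks SQL; hypothetical column.
spark.sql("OPTIMIZE delta.`/tmp/delta/events` ZORDER BY (event_id)")
```

See the Delta Engine section of this guide for the full set of optimizations.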
This guide covers Delta Lake on Databricks and Delta Engine.
- Introduction
- Delta Lake quickstart
- Introductory notebooks
- Ingest data into Delta Lake
- Table batch reads and writes
- Table streaming reads and writes
- Table deletes, updates, and merges
- Table utility commands
- Constraints
- Table versioning
- Delta Lake API reference
- Concurrency control
- Integrations
- Migration guide
- Best practices: Delta Lake
- Frequently asked questions (FAQ)
- Delta Lake resources
- Delta Engine