What happened to Delta Live Tables (DLT)?
The product formerly known as Delta Live Tables (DLT) has been updated to Lakeflow Spark Declarative Pipelines (SDP). If you have previously used DLT, no migration is required to use Lakeflow Spark Declarative Pipelines: your existing code still works in SDP. There are changes you can make to take better advantage of Lakeflow Spark Declarative Pipelines, both now and in the future, and to introduce compatibility with Apache Spark™ Declarative Pipelines (beginning in Apache Spark 4.1).
In Python code, references to `import dlt` can be replaced with `from pyspark import pipelines as dp`, which also requires the following changes (a before-and-after sketch follows this list):

- `@dlt` is replaced with `@dp`.
- The `@table` decorator is now used to create streaming tables, and the new `@materialized_view` decorator is used to create materialized views.
- `@view` is now `@temporary_view`.
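The change amounts to swapping the import and the decorator names. The following is a minimal before-and-after sketch, not a definitive migration guide: the `raw_orders` source table and its columns are hypothetical, and `spark` is assumed to be provided by the pipeline runtime as usual.

```python
# Before: the original DLT Python API.
import dlt
from pyspark.sql.functions import col

@dlt.table  # one decorator covered both streaming tables and materialized views
def orders():
    # Streaming read from a hypothetical source table.
    return spark.readStream.table("raw_orders")

@dlt.view  # pipeline-scoped temporary view
def large_orders():
    return spark.read.table("raw_orders").where(col("amount") > 100)
```

```python
# After: the Lakeflow SDP / Apache Spark Declarative Pipelines API.
from pyspark import pipelines as dp
from pyspark.sql.functions import col

@dp.table  # now creates a streaming table specifically
def orders():
    return spark.readStream.table("raw_orders")

@dp.materialized_view  # new decorator for materialized views
def order_counts():
    # Batch aggregation over the streaming table defined above.
    return spark.read.table("orders").groupBy("customer_id").count()

@dp.temporary_view  # formerly @dlt.view
def large_orders():
    return spark.read.table("raw_orders").where(col("amount") > 100)
```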
For more details on the Python API name changes and the differences between Lakeflow SDP and Apache Spark Declarative Pipelines, see What happened to @dlt? in the pipelines Python reference.
There are still some references to the DLT name in Databricks. The classic SKUs for Lakeflow Spark Declarative Pipelines still begin with DLT, and event log schemas with `dlt` in the name have not changed. Python APIs that used `dlt` in the name can still be used, but Databricks recommends moving to the new names.