Develop Delta Live Tables pipeline code in your local development environment

You can author Python pipeline source code in your preferred integrated development environment (IDE).

You cannot validate or run updates on Delta Live Tables code from an IDE. To do so, you must deploy the source code files back to a Databricks workspace and configure them as part of a Delta Live Tables pipeline.

This article provides an overview of support for local IDE development. For more interactive development and testing, Databricks recommends using notebooks. See Develop and debug Delta Live Tables pipelines in notebooks.

Configure a local IDE for pipeline development

Databricks provides a Python module for local development distributed through PyPI. For installation and usage instructions, see Python stub for Delta Live Tables.

This module provides the interfaces and docstrings for the Delta Live Tables Python interface, enabling syntax checking, autocomplete, and data type checking as you write code in your IDE.

This module includes interfaces but no functional implementations. You cannot use this library to create or run a Delta Live Tables pipeline locally.
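For illustration, a pipeline source file authored locally might look like the following sketch. The package name `databricks-dlt`, the file name `pipeline.py`, and the table `samples.nyctaxi.trips` are assumptions for this example; the stub supplies autocomplete and type information only, and this code executes only after you deploy it as part of a pipeline.

```python
# pipeline.py - Delta Live Tables source authored in a local IDE.
# The stub module (assumed installed with `pip install databricks-dlt`)
# provides interfaces for IDE tooling only; `spark` and the dlt runtime
# are available only when this file runs inside a pipeline update.
import dlt
from pyspark.sql.functions import col


@dlt.table(comment="Raw trip records ingested as-is.")
def raw_trips():
    # Placeholder source table for this sketch.
    return spark.read.table("samples.nyctaxi.trips")


@dlt.table(comment="Trips longer than one mile.")
def long_trips():
    return dlt.read("raw_trips").where(col("trip_distance") > 1.0)
```

With the stub installed, your IDE can flag typos such as a misspelled decorator or a wrong argument type before you deploy the file.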

You can use Databricks Asset Bundles to package and deploy source code and configurations to a target workspace, and to trigger an update on a pipeline configured this way. See Convert a Delta Live Tables pipeline into a Databricks Asset Bundle project.
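A bundle that deploys a single pipeline source file might be configured as in this sketch of a `databricks.yml`. The bundle name, resource key, paths, catalog, schema, and workspace URL are all hypothetical; consult the Databricks Asset Bundles documentation for the full configuration schema.

```yaml
# databricks.yml - hypothetical bundle configuration (illustrative values).
bundle:
  name: my_dlt_bundle

resources:
  pipelines:
    my_pipeline:
      name: my_pipeline
      catalog: main          # assumed Unity Catalog target
      target: dev_schema     # assumed target schema
      libraries:
        - file:
            path: ./src/pipeline.py   # local source file deployed by the bundle
```

After defining the bundle, `databricks bundle deploy` uploads the source and creates or updates the pipeline, and `databricks bundle run my_pipeline` triggers an update on it.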

The Databricks extension for Visual Studio Code has additional functionality for working with pipelines using Databricks Asset Bundles. See Bundle Resource Explorer.

Sync pipeline code from your IDE to a workspace

The following table summarizes options for syncing pipeline source code between your local IDE and a Databricks workspace:

| Tool or pattern | Details |
| --- | --- |
| Databricks Asset Bundles | Use Databricks Asset Bundles to deploy pipeline assets ranging in complexity from a single source code file to configurations for multiple pipelines, jobs, and source code files. See Convert a Delta Live Tables pipeline into a Databricks Asset Bundle project. |
| Databricks extension for Visual Studio Code | Databricks provides an integration with Visual Studio Code that includes easy syncing between your local IDE and workspace files. This extension also provides tools for using Databricks Asset Bundles to deploy pipeline assets. See What is the Databricks extension for Visual Studio Code?. |
| Workspace files | You can use Databricks workspace files to upload your pipeline source code to your Databricks workspace and then import that code into a pipeline. See What are workspace files?. |
| Git folders | Git folders let you sync code between your local environment and a Databricks workspace using a Git repository as the intermediary. See Git integration for Databricks Git folders. |
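For the workspace files pattern, one way to push local source into a workspace is the Databricks CLI `sync` command. The local and workspace paths below are hypothetical, and this assumes the CLI is installed and authenticated against your workspace.

```shell
# Hypothetical paths; assumes the Databricks CLI is configured.
# One-way sync of local pipeline source into workspace files:
databricks sync ./src /Workspace/Users/someone@example.com/dlt-src

# Watch mode keeps the workspace copy updated as you edit locally:
databricks sync --watch ./src /Workspace/Users/someone@example.com/dlt-src
```

Once the files are in the workspace, you can reference them as source code when you configure the pipeline.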