Command Palette commands for the Databricks extension for Visual Studio Code
This article lists Command Palette commands for the Databricks extension for Visual Studio Code. See What is the Databricks extension for Visual Studio Code?.
Command reference
The Databricks extension for Visual Studio Code adds the following commands to the Visual Studio Code Command Palette. See also Command Palette in the Visual Studio Code documentation.
Command | Description
---|---
 | Changes the Python virtual environment.
`Databricks: Configure autocomplete for Databricks globals` | Enables IntelliSense in the Visual Studio Code code editor for PySpark, Databricks Utilities, and related globals such as `spark` and `dbutils`.
 | Moves focus to the Command Palette to create, select, or change the Databricks cluster to use for the current project.
 | Debugs the current file with Databricks Connect.
 | Deploys the bundle to the Databricks workspace.
 | Destroys the bundle.
 | Removes the reference to the Databricks cluster from the current project.
 | Moves focus in the Databricks panel to the Bundle Resource Explorer view.
 | Moves focus in the Databricks panel to the Bundle Variables view.
 | Moves focus in the Databricks panel to the Configuration view.
 | Ignores warnings and forces a deploy of the bundle to the Databricks workspace.
 | Ignores warnings and forces a destroy of the bundle.
 | Initializes the creation workflow for a new Databricks project in Visual Studio Code.
 | Resets the Databricks panel to show the Configure Databricks and Show Quickstart buttons in the Configuration view.
 | Opens the Databricks configuration profiles file, from the default location, for the current project. See Configure your Databricks project using the Databricks extension for Visual Studio Code.
 | Opens the folder that contains the application log files that the Databricks extension for Visual Studio Code writes to your development machine.
 | Overrides bundle variables.
 | Refreshes the Python Environment view in the Databricks panel.
 | Refreshes the bundle configuration for the target.
 | Reinstalls Databricks Connect. See Debug code using Databricks Connect for the Databricks extension for Visual Studio Code.
 | Resets bundle variable values.
 | Runs the current file with Databricks Connect. See Debug code using Databricks Connect for the Databricks extension for Visual Studio Code.
 | Selects a Databricks Asset Bundle target. See Change the target deployment workspace.
 | Shows bundle logs. See View Databricks log output.
 | Shows the Quickstart file in the editor.
 | Signs in to a workspace.
 | Starts the cluster if it is stopped.
 | Starts synchronizing the current project's code to the Databricks workspace. This command performs an incremental synchronization.
 | Starts synchronizing the current project's code to the Databricks workspace. This command performs a full synchronization, even if an incremental sync is possible.
 | Stops the cluster if it is running.
 | Stops synchronizing the current project's code to the Databricks workspace.
 | Verifies Databricks notebook init scripts.
PySpark and Databricks Utilities code completion
This article describes how to enable PySpark and Databricks Utilities code completion for the Databricks extension for Visual Studio Code. See What is the Databricks extension for Visual Studio Code?
This information assumes that you have already installed and set up the Databricks extension for Visual Studio Code. See Install the Databricks extension for Visual Studio Code.
To enable IntelliSense (also known as code completion) in the Visual Studio Code code editor for PySpark, Databricks Utilities, and related globals such as `spark` and `dbutils`, do the following with your project opened:

1. On the Command Palette (View > Command Palette), type `>Databricks: Configure autocomplete for Databricks globals` and press Enter.
2. Follow the on-screen prompts to allow the Databricks extension for Visual Studio Code to install PySpark for your project, and to add or modify the `__builtins__.pyi` file for your project to enable Databricks Utilities.
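The `__builtins__.pyi` file is a Python type stub: it is read by the editor's language server for completion and type information and is never executed at runtime. As a rough sketch of what such a stub can contain (the exact file the extension generates may differ; the `databricks.sdk.runtime` module name is an assumption based on the Databricks SDK for Python):

```
# __builtins__.pyi -- type stub only; analyzed by the editor, never run.
# Assumption: re-exporting the Databricks runtime symbols is what makes
# `spark`, `dbutils`, and friends resolve as globals in the editor.
from databricks.sdk.runtime import *
```

Because the stub only affects static analysis, deleting or regenerating it changes editor behavior but has no effect on how your code runs on the cluster.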
You can now use globals such as `spark` and `dbutils` in your code without declaring any related `import` statements beforehand.
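For illustration, a notebook-style snippet that relies on those globals might look like the following. It runs only where Databricks provides `spark` and `dbutils` at runtime (on a cluster, or through Databricks Connect); the table name is hypothetical:

```
# No imports needed: `spark` (a SparkSession) and `dbutils` are
# provided as globals in the Databricks runtime environment.
# The table name below is hypothetical.
df = spark.read.table("samples.nyctaxi.trips")
df.select("trip_distance", "fare_amount").show(5)

# dbutils example: list files at the DBFS root.
for entry in dbutils.fs.ls("/"):
    print(entry.path)
```

With autocomplete configured, the editor offers completions and hover documentation for these globals locally, even though the objects themselves only exist when the code runs against Databricks.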