Databricks CLI commands
Note
This information applies to Databricks CLI versions 0.205 and above. The Databricks CLI is in Public Preview.
Databricks CLI use is subject to the Databricks License and Databricks Privacy Notice, including any Usage Data provisions.
This article provides information about available Databricks CLI commands. This information supplements the command line help. For more information about installing and using the Databricks CLI, see Install or update the Databricks CLI and What is the Databricks CLI?.
The Databricks CLI includes the command groups listed in the following tables. Command groups contain sets of related commands, which can also contain subcommands. To output usage and syntax information for a command group, an individual command, or subcommand:
databricks <command-group> -h
databricks <command-group> <command-name> -h
databricks <command-group> <command-name> <subcommand-name> -h
Many CLI commands map to operations that are documented in the Databricks REST API reference.
Workspace commands
| Group | Description and commands |
|---|---|
| `fs` | Commands for managing files and the file system. |
| `git-credentials` | Commands for registering personal access tokens for Databricks to do operations on behalf of the user. |
| `repos` | Commands that allow users to manage their Git repos. |
| `secrets` | Commands for managing secrets, secret scopes, and access permissions. |
| `workspace` | Commands to list, import, export, and delete notebooks and folders in the workspace. |
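As a sketch of how these workspace-level groups are invoked (assuming the CLI is installed and a default authentication profile is already configured — both assumptions, so the script falls back to a message otherwise):

```shell
# Illustrative only; requires an installed Databricks CLI and configured auth.
if command -v databricks >/dev/null 2>&1; then
  databricks fs ls dbfs:/          # list files at the DBFS root
  databricks secrets list-scopes   # list secret scopes
  databricks workspace list /      # list notebooks and folders at the workspace root
else
  echo "Databricks CLI not installed; commands shown for illustration only"
fi
```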
Compute commands
| Group | Description and commands |
|---|---|
| `cluster-policies` | Commands to control users' ability to configure clusters based on a set of rules. |
| `clusters` | Commands that allow you to create, start, edit, list, terminate, and delete clusters. |
| `global-init-scripts` | Commands that enable workspace administrators to configure global initialization scripts for their workspace. |
| `instance-pools` | Commands to create, edit, delete, and list instance pools, which use ready-to-use cloud instances to reduce cluster start and auto-scaling times. |
| `instance-profiles` | Commands that allow admins to add, list, and remove instance profiles that users can launch clusters with. |
| `libraries` | Commands to install, uninstall, and get the status of libraries on a cluster. |
| `policy-families` | Commands to view available policy families. |
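A minimal sketch of working with the compute groups (again assuming an installed CLI and configured authentication):

```shell
# Illustrative only; requires an installed Databricks CLI and configured auth.
if command -v databricks >/dev/null 2>&1; then
  databricks clusters list            # list clusters in the workspace
  databricks cluster-policies list    # list cluster policies
  databricks clusters list -o json    # same listing as JSON, for scripting
else
  echo "Databricks CLI not installed; commands shown for illustration only"
fi
```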
Jobs commands
| Group | Description and commands |
|---|---|
| `jobs` | Commands to manage jobs. |
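For example, listing jobs and triggering a run might look like the following (the job ID `123` is hypothetical, and the CLI guard keeps the sketch runnable without an installation):

```shell
# Illustrative only; requires an installed Databricks CLI and configured auth.
if command -v databricks >/dev/null 2>&1; then
  databricks jobs list -o json   # list jobs as JSON
  databricks jobs run-now 123    # trigger a run of an existing job (123 is a hypothetical ID)
else
  echo "Databricks CLI not installed; commands shown for illustration only"
fi
```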
Delta Live Tables commands
| Group | Description and commands |
|---|---|
| `pipelines` | Commands to create, edit, delete, start, and view details about pipelines. |
Machine Learning commands
| Group | Description and commands |
|---|---|
| `experiments` | Commands for interacting with experiments, which are the primary unit of organization in MLflow; all MLflow runs belong to an experiment. |
| `model-registry` | Commands for the Workspace Model Registry. |
Real-time serving commands
| Group | Description and commands |
|---|---|
| `serving-endpoints` | Commands to create, update, and delete model serving endpoints. |
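A quick sketch of inspecting serving endpoints (assumes an installed CLI and configured authentication):

```shell
# Illustrative only; requires an installed Databricks CLI and configured auth.
if command -v databricks >/dev/null 2>&1; then
  databricks serving-endpoints list   # list model serving endpoints in the workspace
else
  echo "Databricks CLI not installed; commands shown for illustration only"
fi
```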
Identity and access management commands
| Group | Description and commands |
|---|---|
| `account` | Commands for managing Databricks accounts. |
| `auth` | Commands for authentication. |
| `current-user` | Commands to retrieve information about the currently authenticated user or service principal. |
| `groups` | Commands for groups, which simplify identity management by making it easier to assign access to Databricks workspaces, data, and other securable objects. |
| `permissions` | Commands to create, read, edit, and update access permissions for various users on different objects and endpoints. |
| `service-principals` | Commands for identities for use with jobs, automated tools, and systems such as scripts, apps, and CI/CD platforms. |
| `users` | Commands for user identities recognized by Databricks and represented by email addresses. |
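To see who the CLI is currently authenticated as, and to enumerate workspace identities (sketch only; assumes an installed CLI and configured authentication):

```shell
# Illustrative only; requires an installed Databricks CLI and configured auth.
if command -v databricks >/dev/null 2>&1; then
  databricks current-user me   # show the authenticated user or service principal
  databricks users list        # list user identities in the workspace
  databricks groups list       # list groups in the workspace
else
  echo "Databricks CLI not installed; commands shown for illustration only"
fi
```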
Unity Catalog commands
| Group | Description and commands |
|---|---|
| `artifact-allowlists` | Commands to manage artifact allowlists. In Databricks Runtime 13.3 and above, you can add libraries and init scripts to the allowlist in Unity Catalog so that users can leverage these artifacts on compute configured with shared access mode. |
| `catalogs` | Commands to manage catalogs, the first layer of Unity Catalog's three-level namespace. |
| `connections` | Commands to create a connection to an external data source. |
| `external-locations` | Commands to manage external locations, which combine a cloud storage path with a storage credential that authorizes access to the cloud storage path. |
| `functions` | Commands to manage user-defined functions (UDFs) in Unity Catalog. |
| `grants` | Commands to grant access to data in Unity Catalog. |
| `metastores` | Commands to manage metastores, which are the top-level container of objects in Unity Catalog. |
| `model-versions` | Commands to manage model versions. Databricks provides a hosted version of MLflow Model Registry in Unity Catalog. |
| `online-tables` | Commands to manage online tables, which provide lower latency and higher QPS access to data from Delta tables. |
| `quality-monitors` | Commands to manage monitors, which compute and monitor data or model quality metrics for a table over time. |
| `registered-models` | Commands to manage registered models. Databricks provides a hosted version of MLflow Model Registry in Unity Catalog. |
| `schemas` | Commands to manage schemas, which are the second layer of Unity Catalog's three-level namespace. |
| `storage-credentials` | Commands to manage storage credentials, which are an authentication and authorization mechanism for accessing data stored on your cloud tenant. |
| `system-schemas` | Commands to manage system schemas, which are schemas that live within the system catalog. |
| `table-constraints` | Commands to manage primary key and foreign key constraints that encode relationships between fields in tables. |
| `tables` | Commands to manage tables, which reside in the third layer of Unity Catalog's three-level namespace. |
| `volumes` | Commands to manage volumes, a Unity Catalog capability for accessing, storing, governing, organizing, and processing files. |
| `workspace-bindings` | Commands to manage securable workspace bindings. Securables in Databricks can be configured as OPEN or ISOLATED. |
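Walking down the three-level namespace from the command line might look like the following sketch (the catalog name `main` and schema name `default` are assumptions; requires an installed CLI and configured authentication):

```shell
# Illustrative only; requires an installed Databricks CLI and configured auth.
if command -v databricks >/dev/null 2>&1; then
  databricks catalogs list                # first layer: catalogs
  databricks schemas list main            # second layer: schemas in a catalog (name assumed)
  databricks tables list main default     # third layer: tables in a schema (names assumed)
else
  echo "Databricks CLI not installed; commands shown for illustration only"
fi
```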
Delta sharing commands
| Group | Description and commands |
|---|---|
| `providers` | Commands to manage data providers, which represent the organizations that share data. |
| `recipient-activation` | Commands to manage recipient activation, which is only applicable in the open sharing model, where the recipient object has the TOKEN authentication type. |
| `recipients` | Commands to manage recipients, which you create using `databricks recipients create`. |
| `shares` | Commands to manage shares, which are containers instantiated with `databricks shares create`. |
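A brief sketch of enumerating Delta Sharing objects (assumes an installed CLI and configured authentication):

```shell
# Illustrative only; requires an installed Databricks CLI and configured auth.
if command -v databricks >/dev/null 2>&1; then
  databricks shares list       # list shares
  databricks recipients list   # list recipients
  databricks providers list    # list data providers
else
  echo "Databricks CLI not installed; commands shown for illustration only"
fi
```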
Settings commands
| Group | Description and commands |
|---|---|
| `ip-access-lists` | Commands that enable admins to configure IP access lists. |
| `settings` | Commands that allow users to manage settings at the workspace level. |
| `token-management` | Commands that enable administrators to get all tokens and delete tokens for other users. |
| `tokens` | Commands to create, list, and revoke tokens that can be used to authenticate and access Databricks REST APIs. |
| `workspace-conf` | Commands to update workspace settings. |
Developer tools commands
| Group | Description and commands |
|---|---|
| `bundle` | Commands to manage Databricks Asset Bundles, which let you express your Databricks projects as code. |
| `sync` | Commands to synchronize a local directory to a workspace directory. |
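A typical developer loop with these groups might look like the following sketch (the target name `dev` and the workspace path are hypothetical; requires an installed CLI and configured authentication):

```shell
# Illustrative only; requires an installed Databricks CLI and configured auth.
if command -v databricks >/dev/null 2>&1; then
  databricks bundle validate        # check the bundle configuration
  databricks bundle deploy -t dev   # deploy to a target named "dev" (hypothetical)
  # Mirror the current directory into a workspace path (path is hypothetical):
  databricks sync . /Workspace/Users/someone@example.com/my-project
else
  echo "Databricks CLI not installed; commands shown for illustration only"
fi
```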
Vector search commands
| Group | Description and commands |
|---|---|
| `vector-search-endpoints` | Commands to manage vector search endpoints, which represent the compute resources that host vector search indexes. |
| `vector-search-indexes` | Commands to manage vector search indexes, an efficient representation of your embedding vectors that supports real-time and efficient approximate nearest neighbor (ANN) search queries. |
Dashboard commands
| Group | Description and commands |
|---|---|
| `dashboards` | Commands for modifying dashboards. |
| `lakeview` | Commands that provide specific management operations for AI/BI dashboards. |
Additional commands
| Group | Description and commands |
|---|---|
| `api` | Commands to make requests to the Databricks REST API. |
| `completion` | Commands to generate the autocompletion script for the specified shell. |
| `configure` | Configure the Databricks CLI. |
| `help` | Output usage information for any command. |
| `labs` | Commands to manage Databricks Labs installations. |
| `version` | Retrieve the version of the CLI currently being used. |
Global flags
The following flags are available to all Databricks CLI commands. Note that some flags do not apply to some commands. For detailed information about specific commands and their flags, see the command-line help.
| Flag | Description |
|---|---|
| `-h, --help` | Display help for the Databricks CLI or the related command group or command. |
| `-e, --environment` | A string representing the bundle environment to use, if applicable for the related command. |
| `--log-file` | A string representing the file to write output logs to. If this flag is not specified, the default is to write output logs to stderr. |
| `--log-format` | The log format, `text` or `json`. The default is `text`. |
| `--log-level` | A string representing the log format level. If not specified, the log format level is disabled. |
| `-o, --output` | The command output type, `text` or `json`. The default is `text`. |
| `-p, --profile` | A string representing the named configuration profile to use within your `.databrickscfg` file. |
| `--progress-format` | The format for progress logs to display: `default`, `append`, `inplace`, or `json`. |
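Global flags can be combined with any command. As a sketch (the profile name `DEFAULT` and the log file path are assumptions; requires an installed CLI and configured authentication):

```shell
# Illustrative only; requires an installed Databricks CLI and configured auth.
if command -v databricks >/dev/null 2>&1; then
  # JSON output, verbose logging to a local file, and an explicit profile
  # (the profile name DEFAULT is an assumption):
  databricks clusters list -o json --log-level debug --log-file ./cli.log -p DEFAULT
else
  echo "Databricks CLI not installed; commands shown for illustration only"
fi
```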