Databricks CLI commands
This information applies to Databricks CLI versions 0.205 and above. The Databricks CLI is in Public Preview.
Databricks CLI use is subject to the Databricks License and Databricks Privacy Notice, including any Usage Data provisions.
This article provides information about available Databricks CLI commands. This information supplements the command line help. For more information about installing and using the Databricks CLI, see Install or update the Databricks CLI and What is the Databricks CLI?.
The Databricks CLI includes the command groups listed in the following tables. Command groups contain sets of related commands, which can also contain subcommands. To output usage and syntax information for a command group, an individual command, or a subcommand:

```shell
databricks <command-group> -h
databricks <command-group> <command-name> -h
databricks <command-group> <command-name> <subcommand-name> -h
```
Many CLI commands map to operations that are documented in the Databricks REST API reference.
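As a quick sketch of this mapping (assuming the CLI is installed and authentication is already configured for your workspace), the `current-user` command group calls the corresponding "get current user" REST operation:

```shell
# Confirm which identity the CLI is authenticated as;
# this maps to the "get current user" REST API operation
databricks current-user me
```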
Workspace commands
Group | Description and commands |
---|---|
`fs` | Commands for managing files and the file system. |
`git-credentials` | Commands for registering personal access tokens for Databricks to do operations on behalf of the user. |
`repos` | Commands for allowing users to manage their Git repos. |
`secrets` | Commands for managing secrets, secret scopes, and access permissions. |
`workspace` | Commands to list, import, export, and delete notebooks and folders in the workspace. |
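A minimal sketch of typical workspace commands (the paths shown are examples, assuming a configured workspace):

```shell
# List objects at a workspace path
databricks workspace list /Users

# List files at a DBFS path
databricks fs ls dbfs:/

# List the secret scopes in the workspace
databricks secrets list-scopes
```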
Compute commands
Group | Description and commands |
---|---|
`cluster-policies` | Commands to control users' ability to configure clusters based on a set of rules. |
`clusters` | Commands that allow you to create, start, edit, list, terminate, and delete clusters. |
`global-init-scripts` | Commands that enable workspace administrators to configure global initialization scripts for their workspace. |
`instance-pools` | Commands to create, edit, delete, and list instance pools, which use ready-to-use cloud instances to reduce cluster start and auto-scaling times. |
`instance-profiles` | Commands that allow admins to add, list, and remove instance profiles that users can launch clusters with. |
`libraries` | Commands to install, uninstall, and get the status of libraries on a cluster. |
`policy-families` | Commands to view available policy families. |
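A short sketch of common compute commands (the cluster ID below is a hypothetical example):

```shell
# List clusters, with JSON output
databricks clusters list --output json

# Get details for a specific cluster (example cluster ID)
databricks clusters get 1234-567890-abcde123

# Show the status of libraries on a cluster (example cluster ID)
databricks libraries cluster-status 1234-567890-abcde123
```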
Jobs commands
Group | Description and commands |
---|---|
`jobs` | Commands to manage jobs. |
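For example (the job ID below is a hypothetical placeholder):

```shell
# List jobs in the workspace
databricks jobs list

# Get details for a job (example job ID)
databricks jobs get 1234

# Trigger a run of an existing job (example job ID)
databricks jobs run-now 1234
```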
DLT commands
Group | Description and commands |
---|---|
`pipelines` | Commands to create, edit, delete, start, and view details about pipelines. |
Machine Learning commands
Group | Description and commands |
---|---|
`experiments` | Commands for interacting with experiments, which are the primary unit of organization in MLflow; all MLflow runs belong to an experiment. |
`model-registry` | Commands for the Workspace Model Registry. |
Real-time serving commands
Group | Description and commands |
---|---|
`serving-endpoints` | Commands to create, update, and delete model serving endpoints. |
Identity and access management commands
Group | Description and commands |
---|---|
`account` | Commands for managing Databricks accounts. |
`auth` | Commands for authentication. |
`current-user` | Commands to retrieve information about the currently authenticated user or service principal. |
`groups` | Commands for groups, which simplify identity management by making it easier to assign access to Databricks workspaces, data, and other securable objects. |
`permissions` | Commands to create, read, write, edit, update, and manage access for various users on different objects and endpoints. |
`service-principals` | Commands for identities for use with jobs, automated tools, and systems such as scripts, apps, and CI/CD platforms. |
`users` | Commands for user identities recognized by Databricks and represented by email addresses. |
SQL-related commands
Group | Description and commands |
---|---|
`alerts` | Commands to perform operations on alerts. |
`alerts-legacy` | Commands to perform operations on legacy alerts. |
| Commands for making new query objects. |
`queries` | Commands to perform operations on query definitions. |
`queries-legacy` | Commands to perform operations on legacy query definitions. |
`query-history` | Commands to access the history of queries through SQL warehouses. |
`warehouses` | Commands to manage SQL warehouses, which are compute resources that let you run SQL commands on data objects within Databricks SQL. |
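For example (the warehouse ID below is a hypothetical placeholder):

```shell
# List SQL warehouses in the workspace
databricks warehouses list

# Get details for a specific SQL warehouse (example warehouse ID)
databricks warehouses get 1234567890abcdef
```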
Unity Catalog commands
Group | Description and commands |
---|---|
`artifact-allowlists` | Commands to manage artifact allowlists. In Databricks Runtime 13.3 and above, you can add libraries and init scripts to the allowlist in Unity Catalog so that users can leverage these artifacts on compute configured with standard access mode. |
`catalogs` | Commands to manage catalogs, the first layer of Unity Catalog's three-level namespace. |
`credentials` | Commands to manage credentials, which are the authentication and authorization mechanism for accessing services on your cloud tenant. Each credential is subject to Unity Catalog access-control policies that control which users and groups can access the credential. |
`connections` | Commands to create a connection to an external data source. |
`external-locations` | Commands to manage external locations, which combine a cloud storage path with a storage credential that authorizes access to the cloud storage path. |
`functions` | Commands to manage user-defined functions (UDFs) in Unity Catalog. |
`grants` | Commands to grant access to data in Unity Catalog. |
`metastores` | Commands to manage metastores, which are the top-level container of objects in Unity Catalog. |
`model-versions` | Commands to manage model versions. Databricks provides a hosted version of MLflow Model Registry in Unity Catalog. |
`online-tables` | Commands to manage online tables, which provide lower-latency and higher-QPS access to data from Delta tables. |
`quality-monitors` | Commands to manage monitors, which compute and monitor data or model quality metrics for a table over time. |
`registered-models` | Commands to manage registered models. Databricks provides a hosted version of MLflow Model Registry in Unity Catalog. |
`resource-quotas` | Commands to manage resource quotas enforced by Unity Catalog on all securable objects, which limit the number of resources that can be created. |
`schemas` | Commands to manage schemas, which are the second layer of Unity Catalog's three-level namespace. |
`storage-credentials` | Commands to manage storage credentials, which are an authentication and authorization mechanism for accessing data stored on your cloud tenant. |
`system-schemas` | Commands to manage system schemas, which are schemas that live within the system catalog. |
`table-constraints` | Commands to manage primary key and foreign key constraints that encode relationships between fields in tables. |
`tables` | Commands to manage tables, which reside in the third layer of Unity Catalog's three-level namespace. |
`temporary-table-credentials` | Commands to manage temporary table credentials, which are short-lived, downscoped credentials used to access cloud storage locations where table data is stored in Databricks. |
`volumes` | Commands to manage volumes, which are a Unity Catalog (UC) capability for accessing, storing, governing, organizing, and processing files. |
`workspace-bindings` | Commands to manage securable workspace bindings. Securables in Databricks can be configured as open, accessible from any workspace, or isolated, accessible only from the workspaces to which they are bound. |
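A brief sketch of browsing the three-level namespace with these commands (`main` and `default` are example catalog and schema names):

```shell
# List catalogs in the current metastore
databricks catalogs list

# List schemas in a catalog (example catalog name)
databricks schemas list main

# List tables in a schema (example catalog and schema names)
databricks tables list main default
```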
Delta sharing commands
Group | Description and commands |
---|---|
`providers` | Commands to manage data providers, which represent the organizations that share data. |
`recipient-activation` | Commands to manage recipient activation, which is applicable only in the open sharing model, where the recipient object has the TOKEN authentication type. |
`recipients` | Commands to manage recipients, which you create using `databricks recipients create`. |
`shares` | Commands to manage shares, which are containers instantiated with `databricks shares create`. |
Settings commands
Group | Description and commands |
---|---|
`ip-access-lists` | Commands to enable admins to configure IP access lists. |
`notification-destinations` | Commands to manage a workspace's notification destinations. |
`settings` | Commands to allow users to manage settings at the workspace level. |
`token-management` | Commands that enable administrators to get all tokens and delete tokens for other users. |
`tokens` | Commands to create, list, and revoke tokens that can be used to authenticate and access Databricks REST APIs. |
`workspace-conf` | Commands to update workspace settings. |
Developer tools commands
Group | Description and commands |
---|---|
`bundle` | Commands to manage Databricks Asset Bundles, which let you express your Databricks projects as code. |
`sync` | Commands to synchronize a local directory with a workspace directory. |
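A minimal sketch of a bundle workflow (the target name and workspace path below are hypothetical examples):

```shell
# Validate the bundle configuration in the current directory
databricks bundle validate

# Deploy the bundle to a target defined in the bundle configuration (example target)
databricks bundle deploy -t dev

# One-way sync of the current local directory to a workspace path (example path)
databricks sync . /Workspace/Users/someone@example.com/my-project
```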
Vector search commands
Group | Description and commands |
---|---|
`vector-search-endpoints` | Commands to manage vector search endpoints, which represent the compute resources that host vector search indexes. |
`vector-search-indexes` | Commands to manage vector search indexes, an efficient representation of your embedding vectors that supports real-time and efficient approximate nearest neighbor (ANN) search queries. |
Dashboard commands
Group | Description and commands |
---|---|
`dashboards` | Commands for modifying legacy dashboards. |
`genie` | Commands for modifying Genie, which provides a no-code experience for business users, powered by AI/BI. |
`lakeview` | Commands that provide specific management operations for AI/BI dashboards. |
Marketplace commands
Group | Description and commands |
---|---|
`consumer-fulfillments` | Commands for managing fulfillments, which are entities that allow consumers to preview installations. |
`consumer-installations` | Commands for managing installations, which are entities that allow consumers to interact with Databricks Marketplace listings. |
`consumer-listings` | Commands for managing Databricks Marketplace consumer listings. |
`consumer-personalization-requests` | Commands for managing personalization requests, which allow customers to interact with the individualized Marketplace listing flow. |
`consumer-providers` | Commands for managing providers, which are the entities that publish listings to the Marketplace. |
`provider-exchange-filters` | Commands for managing Marketplace exchange filters, which curate which groups can access an exchange. |
`provider-exchanges` | Commands for managing Marketplace exchanges, which allow providers to share their listings with a curated set of customers. |
`provider-files` | Commands for managing Marketplace files, such as preview notebooks and provider icons. |
`provider-listings` | Commands for managing provider listings. |
`provider-personalization-requests` | Commands for managing personalization requests, which are an alternative to instantly available listings. |
`provider-provider-analytics-dashboards` | Commands to manage the templated analytics for providers. |
`provider-providers` | Commands to manage providers, which manage assets in Marketplace. |
Apps commands
Group | Description and commands |
---|---|
`apps` | Commands to manage Databricks apps, which run directly on your Databricks instance, integrate with your data, use and extend Databricks services, and enable users to interact through single sign-on. |
Clean rooms commands
Group | Description and commands |
---|---|
`clean-room-assets` | Commands to manage clean room assets, which are data and code objects such as tables, volumes, and notebooks that are shared with the clean room. |
`clean-room-task-runs` | Commands to manage clean room task runs, which are the executions of notebooks in a clean room. |
`clean-rooms` | Commands to manage clean rooms. Clean rooms use Delta Sharing and serverless compute to provide a secure and privacy-protecting environment where multiple parties can work together on sensitive enterprise data without direct access to each other's data. |
Additional commands
Group | Description and commands |
---|---|
`api` | Commands to make requests to the Databricks REST API. |
`completion` | Commands to generate the autocompletion script for the specified shell. |
`configure` | Configure the Databricks CLI. |
`help` | Output usage information for any command. |
`labs` | Commands to manage Databricks Labs installations. |
`version` | Retrieve the version of the CLI currently being used. |
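For example, the `api` command group can call a REST endpoint directly when no dedicated command exists (the endpoint below is an example):

```shell
# Call a REST API endpoint directly (example endpoint)
databricks api get /api/2.1/clusters/list

# Print the installed CLI version
databricks -v
```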
Global flags
The following flags are available to all Databricks CLI commands. Note that some flags do not apply to some commands. For detailed information about specific commands and their flags, see the command-line help.
Flag | Description |
---|---|
`-h` or `--help` | Display help for the Databricks CLI, the related command group, or the related command. |
`-e` or `--environment` | A string representing the bundle environment to use, if applicable for the related command. |
`--log-file` | A string representing the file to write output logs to. If this flag is not specified, the default is to write output logs to stderr. |
`--log-format` | The log format type, `text` or `json`. The default is `text`. |
`--log-level` | A string representing the log level, for example `debug` or `info`. If not specified, logging is disabled. |
`-o` or `--output` | The command output type, `text` or `json`. The default is `text`. |
`-p` or `--profile` | A string representing the named configuration profile to use within your `.databrickscfg` file. |
`--progress-format` | The format for progress logs to display (`default`, `append`, `inplace`, or `json`). |
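A short sketch combining some of these flags (the profile name is an example from a hypothetical `.databrickscfg` file):

```shell
# Use a named configuration profile and JSON output
databricks clusters list -p DEFAULT -o json

# Write debug-level logs to a local file instead of stderr
databricks clusters list --log-level debug --log-file ./cli.log
```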