SQL language reference
This is a SQL command reference for Databricks SQL and Databricks Runtime.
For information about using SQL with Delta Live Tables, see Delta Live Tables SQL language reference.
“Applies to” label
The SQL reference encompasses both Databricks SQL and Databricks Runtime. Near the top of each article is a label that indicates which products and versions are supported for that article.
For example, this label means an article applies to all versions of both Databricks SQL and Databricks Runtime:
Applies to: Databricks SQL, Databricks Runtime
In this example, the article applies to all versions of Databricks SQL and to Databricks Runtime versions 11.1 and above:
Applies to: Databricks SQL, Databricks Runtime 11.1 and above
In this example, the article applies only to Databricks SQL and is not supported in Databricks Runtime:
Applies to: Databricks SQL
In some cases, an article applies broadly to a product, but certain parameters within the article are supported only by one product or by specific versions of a product. In that case, the Applies to label is placed on the relevant parameter in the body of the article. For example:
The top of an article might state: Applies to: Databricks SQL, Databricks Runtime
One specific parameter within the article might indicate Applies to: Databricks SQL, Databricks Runtime 11.1 and above. This means that most of the article is supported by all versions of both Databricks SQL and Databricks Runtime, while that specific parameter is supported by all versions of Databricks SQL but only by Databricks Runtime versions 11.1 and above.
Another specific parameter within the article might indicate Applies to: Databricks Runtime 11.1 and above. This means that most of the article is supported by all versions of Databricks SQL and Databricks Runtime, but that specific parameter is not supported in Databricks SQL and is supported only in Databricks Runtime versions 11.1 and above.
General reference
This general reference describes data types, functions, identifiers, literals, and semantics:
- How to read a syntax diagram
- Configuration parameters
- Data types and literals
- Functions
- SQL data type rules
- Datetime patterns
- H3 geospatial functions
- Lambda functions
- Window functions
- Identifiers
- Names
- Null semantics
- Expressions
- Name resolution
- JSON path expressions
- Partitions
- ANSI compliance
- Apache Hive compatibility
- Principals
- Privileges and securable objects in Unity Catalog
- Privileges and securable objects in the Hive metastore
- External locations
- External tables
- Storage credentials
- Delta Sharing
- Information schema
- Reserved words
DDL statements
You use data definition statements to create or modify the structure of objects in a database:
- ALTER CATALOG
- ALTER CREDENTIAL
- ALTER DATABASE
- ALTER LOCATION
- ALTER PROVIDER
- ALTER RECIPIENT
- ALTER TABLE
- ALTER SCHEMA
- ALTER SHARE
- ALTER VIEW
- COMMENT ON
- CREATE BLOOMFILTER INDEX
- CREATE CATALOG
- CREATE DATABASE
- CREATE FUNCTION (SQL)
- CREATE FUNCTION (External)
- CREATE LOCATION
- CREATE RECIPIENT
- CREATE SCHEMA
- CREATE SHARE
- CREATE TABLE
- CREATE VIEW
- DROP BLOOMFILTER INDEX
- DROP CATALOG
- DROP DATABASE
- DROP CREDENTIAL
- DROP FUNCTION
- DROP LOCATION
- DROP PROVIDER
- DROP RECIPIENT
- DROP SCHEMA
- DROP SHARE
- DROP TABLE
- DROP VIEW
- MSCK REPAIR TABLE
- SYNC
- TRUNCATE TABLE
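As a minimal sketch of the DDL statements listed above, the following creates a table, alters it, and drops it (the catalog, schema, table, and column names are illustrative, not part of this reference):

```sql
-- Create a managed table (names are illustrative)
CREATE TABLE main.default.events (
  id BIGINT,
  event_time TIMESTAMP,
  payload STRING
);

-- Add a column and attach a comment
ALTER TABLE main.default.events ADD COLUMN source STRING;
COMMENT ON TABLE main.default.events IS 'Raw application events';

-- Remove the table when it is no longer needed
DROP TABLE main.default.events;
```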
DML statements
You use data manipulation statements to add, change, or delete data from a Delta Lake table:
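A minimal sketch of data manipulation against a Delta Lake table (the table name and values are illustrative):

```sql
-- Add a row to a Delta table (names and values are illustrative)
INSERT INTO main.default.events VALUES (1, current_timestamp(), 'login');

-- Change existing rows
UPDATE main.default.events SET payload = 'logout' WHERE id = 1;

-- Delete rows
DELETE FROM main.default.events WHERE id = 1;
```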
Data retrieval statements
You use a query to retrieve rows from one or more tables according to the specified clauses. The full syntax and a brief description of the supported clauses are explained in the Query article. The related SQL statements SELECT and VALUES are also included in this section.
Databricks SQL also provides the ability to generate the logical and physical plan for a query using the EXPLAIN statement.
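A brief sketch of these three statements together (the table name is illustrative):

```sql
-- A query with common clauses
SELECT source, count(*) AS event_count
FROM main.default.events
GROUP BY source
ORDER BY event_count DESC
LIMIT 10;

-- VALUES produces an inline rowset
VALUES (1, 'a'), (2, 'b');

-- EXPLAIN generates the logical and physical plan for a query
EXPLAIN SELECT * FROM main.default.events;
```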
Delta Lake statements
You use Delta Lake SQL statements to manage tables stored in Delta Lake format:
For details on using Delta Lake statements, see What is Delta Lake?.
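A minimal sketch of common Delta Lake statements (the table name is illustrative; see What is Delta Lake? for the full set and their options):

```sql
-- Inspect the transaction history of a Delta table
DESCRIBE HISTORY main.default.events;

-- Compact small files for better read performance
OPTIMIZE main.default.events;

-- Remove data files no longer referenced by the table
VACUUM main.default.events;
```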
Auxiliary statements
You use auxiliary statements to collect statistics, manage caching, explore metadata, set configurations, and manage resources:
Apache Spark Cache statements
Applies to: Databricks Runtime
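A minimal sketch of the cache statements (the table name is illustrative):

```sql
-- Cache a table in memory on the cluster, then release it
CACHE TABLE main.default.events;
UNCACHE TABLE main.default.events;

-- Clear all cached entries
CLEAR CACHE;
```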
Show statements
- LIST
- SHOW ALL IN SHARE
- SHOW CATALOGS
- SHOW COLUMNS
- SHOW CREATE TABLE
- SHOW CREDENTIALS
- SHOW DATABASES
- SHOW FUNCTIONS
- SHOW GROUPS
- SHOW LOCATIONS
- SHOW PARTITIONS
- SHOW PROVIDERS
- SHOW RECIPIENTS
- SHOW SCHEMAS
- SHOW SHARES
- SHOW SHARES IN PROVIDER
- SHOW TABLE
- SHOW TABLES
- SHOW TBLPROPERTIES
- SHOW USERS
- SHOW VIEWS
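The SHOW statements above can be combined to explore catalog metadata top-down, as in this sketch (the catalog and schema names are illustrative):

```sql
-- Drill down from catalogs to columns
SHOW CATALOGS;
SHOW SCHEMAS IN main;
SHOW TABLES IN main.default;
SHOW COLUMNS IN main.default.events;
```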
Resource management
Applies to: Databricks Runtime
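A minimal sketch of resource management statements (the JAR path is illustrative):

```sql
-- Register a JAR with the cluster and list registered JARs
ADD JAR '/path/to/library.jar';
LIST JAR;
```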
Security statements
You use security SQL statements to manage access to data:
For details on using these statements, see Data object privileges.
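A minimal sketch of granting, inspecting, and revoking a privilege (the principal and table names are illustrative):

```sql
-- Grant read access on a table to a group
GRANT SELECT ON TABLE main.default.events TO `data_analysts`;

-- Inspect grants on the table
SHOW GRANTS ON TABLE main.default.events;

-- Revoke the privilege
REVOKE SELECT ON TABLE main.default.events FROM `data_analysts`;
```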