
Databricks SQL release notes

Applies to: AWS, GCP, and Azure

The following Databricks SQL features and improvements were released in 2025.

October 2025

Databricks SQL version 2025.35 is now available in Preview

October 30, 2025

Databricks SQL version 2025.35 is now available in the Preview channel. Review the following section to learn about new features, behavioral changes, and bug fixes.

EXECUTE IMMEDIATE using constant expressions

You can now pass constant expressions as the SQL string and as arguments to parameter markers in EXECUTE IMMEDIATE statements.
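A minimal sketch (the table name `t` is hypothetical):

```sql
-- The SQL string can now be built from a constant expression,
-- and parameter-marker arguments can be constant expressions too.
EXECUTE IMMEDIATE 'SELECT * FROM ' || 't' || ' WHERE id = ?'
  USING 10 + 5;
```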

LIMIT ALL support for recursive CTEs

You can now use LIMIT ALL to remove the total size restriction on recursive common table expressions (CTEs).
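For example, a recursive CTE that would otherwise hit the total size restriction can opt out of it (a sketch; the default guard behavior may vary by configuration):

```sql
WITH RECURSIVE nums (n) AS (
  SELECT 1
  UNION ALL
  SELECT n + 1 FROM nums WHERE n < 2000000
)
SELECT count(*) FROM nums LIMIT ALL;  -- LIMIT ALL lifts the size restriction
```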

st_dump function support

You can now use the st_dump function to get an array containing the single geometries of the input geometry.
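A minimal sketch, assuming WKT input via st_geomfromtext:

```sql
-- Returns an array with one single geometry per point in the multipoint
SELECT st_dump(st_geomfromtext('MULTIPOINT (1 1, 2 2)'));
```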

Polygon interior ring functions are now supported

You can now use the following functions to work with polygon interior rings:

  • st_numinteriorrings: Get the number of inner boundaries (rings) of a polygon.
  • st_interiorringn: Extract the n-th inner boundary of a polygon and return it as a linestring.
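
The two functions can be combined, for example on a polygon with one hole (WKT input via st_geomfromtext; ring indexing is assumed to be 1-based, as in other spatial SQL dialects):

```sql
SELECT
  st_numinteriorrings(g) AS num_rings,   -- 1 for this polygon
  st_interiorringn(g, 1) AS first_ring   -- the hole, as a linestring
FROM (SELECT st_geomfromtext(
  'POLYGON ((0 0, 10 0, 10 10, 0 10, 0 0), (2 2, 4 2, 4 4, 2 4, 2 2))') AS g);
```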

Add metadata column to DESCRIBE QUERY and DESCRIBE TABLE

SAP Databricks now includes a metadata column in the output of DESCRIBE QUERY and DESCRIBE TABLE for semantic metadata.

For DESCRIBE QUERY, when describing a query with metric views, semantic metadata propagates through the query if dimensions are directly referenced and measures use the MEASURE() function.

For DESCRIBE TABLE, the metadata column appears only for metric views, not other table types.

Default mode change for FSCK REPAIR TABLE command

The FSCK REPAIR TABLE command now includes an initial metadata repair step that validates checkpoints and partition values before removing references to missing data files.

Correct handling of null structs when dropping NullType columns

When writing to Delta tables, SAP Databricks now correctly preserves null struct values when dropping NullType columns from the schema. Previously, null structs were incorrectly replaced with non-null struct values where all fields were set to null.

New alert editing experience

October 20, 2025

Creating or editing an alert now opens in the new multi-tab editor, providing a unified editing workflow.

Visualizations fix

October 9, 2025

Legend selection now works correctly for charts with aliased series names in SQL editor and notebooks.

September 2025

Databricks SQL version 2025.30 is now available in Preview

September 25, 2025

Databricks SQL version 2025.30 is now available in the Preview channel. Review the following section to learn about new features, behavioral changes, and bug fixes.

UTF8 based collations now support LIKE operator

You can now use LIKE with columns that have one of the following collations enabled: UTF8_Binary, UTF8_Binary_RTRIM, UTF8_LCASE, UTF8_LCASE_RTRIM.
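For example (table and column names are hypothetical):

```sql
CREATE TABLE names (name STRING COLLATE UTF8_LCASE);
-- Under UTF8_LCASE, the pattern match is case-insensitive
SELECT * FROM names WHERE name LIKE 'smith%';
```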

ST_ExteriorRing function is now supported

You can now use the ST_ExteriorRing function to extract the outer boundary of a polygon and return it as a linestring.
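A minimal sketch, assuming WKT input via st_geomfromtext:

```sql
-- Returns the outer boundary of the polygon as a linestring
SELECT st_exteriorring(st_geomfromtext('POLYGON ((0 0, 4 0, 4 4, 0 4, 0 0))'));
```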

Declare multiple session or local variables in a single DECLARE statement

You can now declare multiple session or local variables of the same type and default value in a single DECLARE statement.
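A sketch of the assumed syntax:

```sql
-- Three INT session variables, all defaulting to 0, in one statement
DECLARE VARIABLE x, y, z INT DEFAULT 0;
```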

Support TEMPORARY keyword for metric view creation

You can now use the TEMPORARY keyword when creating a metric view. Temporary metric views are visible only in the session that created them and are dropped when the session ends.

DESCRIBE CONNECTION shows environment settings for JDBC connections

SAP Databricks now includes user-defined environment settings in the DESCRIBE CONNECTION output for JDBC connections that support custom drivers and run in isolation. Other connection types remain unchanged.

Correct results for split with empty regex and positive limit

SAP Databricks now returns correct results when using split function with an empty regex and a positive limit. Previously, the function incorrectly truncated the remaining string instead of including it in the last element.

Fix url_decode and try_url_decode error handling in Photon

In Photon, try_url_decode() and url_decode() with failOnError = false now return NULL for invalid URL-encoded strings instead of failing the query.
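For example:

```sql
SELECT url_decode('a%20b');    -- returns 'a b'
SELECT try_url_decode('%ZZ');  -- invalid encoding now returns NULL instead of failing
```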

August 2025

Default warehouse setting is now available in Preview

August 28, 2025

Set a default warehouse that will be automatically selected in the compute selector across the SQL editor, Alerts, and Catalog Explorer. Individual users can override this setting by selecting a different warehouse before running a query. They can also define their own user-level default warehouse to apply across their sessions.

Databricks SQL version 2025.25 is rolling out in Current

August 21, 2025

Databricks SQL version 2025.25 is rolling out to the Current channel from August 20 through August 28, 2025. See features in 2025.25.

Databricks SQL version 2025.25 is now available in Preview

August 14, 2025

Databricks SQL version 2025.25 is now available in the Preview channel. Review the following section to learn about new features and behavioral changes.

Recursive common table expressions (rCTE) are generally available

Recursive common table expressions (rCTEs) are generally available. Navigate hierarchical data using a self-referencing CTE with UNION ALL to follow the recursive relationship.
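For example, walking an org hierarchy (the employees table is hypothetical):

```sql
WITH RECURSIVE org (id, manager_id, depth) AS (
  SELECT id, manager_id, 1 FROM employees WHERE manager_id IS NULL
  UNION ALL
  SELECT e.id, e.manager_id, o.depth + 1
  FROM employees e JOIN org o ON e.manager_id = o.id
)
SELECT * FROM org;
```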

Support for schema and catalog level default collation

You can now set a default collation for schemas and catalogs. This allows you to define a collation that applies to all objects created within the schema or catalog, ensuring consistent collation behavior across your data.
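A sketch of the assumed syntax:

```sql
-- New string columns of objects in this schema default to case-insensitive collation
CREATE SCHEMA reporting DEFAULT COLLATION UTF8_LCASE;
```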

Support for Spatial SQL expressions and GEOMETRY and GEOGRAPHY data types

You can now store geospatial data in built-in GEOMETRY and GEOGRAPHY columns for improved performance of spatial queries. This release adds more than 80 new spatial SQL expressions, including functions for importing, exporting, measuring, constructing, editing, validating, transforming, and determining topological relationships with spatial joins.

Better handling of JSON options with VARIANT

The from_json and to_json functions now correctly apply JSON options when working with top-level VARIANT schemas. This ensures consistent behavior with other supported data types.

Support for TIMESTAMP WITHOUT TIME ZONE syntax

You can now specify TIMESTAMP WITHOUT TIME ZONE instead of TIMESTAMP_NTZ. This change improves compatibility with the SQL Standard.
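For example:

```sql
-- Equivalent column types
CREATE TABLE events (
  created_at TIMESTAMP WITHOUT TIME ZONE,  -- SQL Standard spelling
  updated_at TIMESTAMP_NTZ                 -- existing spelling
);
```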

Resolved subquery correlation issue

SAP Databricks no longer incorrectly correlates semantically equal aggregate expressions between a subquery and its outer query. Previously, this could lead to incorrect query results.

Error thrown for invalid CHECK constraints

SAP Databricks now throws an AnalysisException if a CHECK constraint expression cannot be resolved during constraint validation.

New SQL editor is generally available

August 14, 2025

The new SQL editor is now generally available. The new SQL editor provides a unified authoring environment with support for multiple statement results, inline execution history, real-time collaboration, enhanced Databricks Assistant integration, and additional productivity features.

July 2025

Preset date ranges for parameters in the SQL editor

July 31, 2025

In the new SQL editor, you can now choose from preset date ranges, such as This week, Last 30 days, or Last year, when using timestamp, date, and date range parameters. These presets make it faster to apply common time filters without manually entering dates.

Inline execution history in SQL editor

July 24, 2025

Inline execution history is now available in the new SQL editor, allowing you to quickly access past results without re-executing queries. Easily reference previous executions, navigate directly to past query profiles, or compare run times and statuses—all within the context of your current query.

Databricks SQL version 2025.20 is now available in Current

July 17, 2025

Databricks SQL version 2025.20 is rolling out in stages to the Current channel. For features and updates in this release, see 2025.20 features.

SQL editor updates

July 17, 2025

  • Improvements to named parameters: Date-range and multi-select parameters are now supported.

  • Updated header layout in SQL editor: The run button and catalog picker have moved to the header, creating more vertical space for writing queries.

Git support for alerts

July 17, 2025

You can now use Databricks Git folders to track and manage changes to alerts. To track alerts with Git, place them in a Databricks Git folder. Newly cloned alerts appear in the alerts list page and API only after a user interacts with them. Their schedules are paused and must be explicitly resumed by users.

Databricks SQL version 2025.20 is now available in Preview

July 3, 2025

Databricks SQL version 2025.20 is now available in the Preview channel. Review the following section to learn about new features and behavioral changes.

SQL procedure support

SQL scripts can now be encapsulated in a procedure stored as a reusable asset in Unity Catalog. You can create a procedure using the CREATE PROCEDURE command, and then call it using the CALL command.
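A minimal sketch (procedure and table names are hypothetical, and the exact clause set may vary):

```sql
CREATE PROCEDURE main.default.log_event(IN msg STRING)
LANGUAGE SQL
AS BEGIN
  INSERT INTO main.default.event_log VALUES (current_timestamp(), msg);
END;

CALL main.default.log_event('nightly load complete');
```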

Set a default collation for SQL Functions

Using the new DEFAULT COLLATION clause in the CREATE FUNCTION command defines the default collation used for STRING parameters, the return type, and STRING literals in the function body.
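A sketch of the assumed clause placement:

```sql
CREATE FUNCTION normalize_name(s STRING)
RETURNS STRING
DEFAULT COLLATION UTF8_LCASE  -- applies to STRING parameters, return type, and literals
RETURN trim(s);
```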

Recursive common table expressions (rCTE) support

SAP Databricks now supports navigation of hierarchical data using recursive common table expressions (rCTEs). Use a self-referencing CTE with UNION ALL to follow the recursive relationship.

Support ALL CATALOGS in SHOW SCHEMAS

The SHOW SCHEMAS syntax is updated to accept the following syntax:

SHOW SCHEMAS [ { FROM | IN } { catalog_name | ALL CATALOGS } ] [ [ LIKE ] pattern ]

When ALL CATALOGS is specified in a SHOW query, the execution iterates through all active catalogs that support namespaces using the catalog manager (DSv2). For each catalog, it includes the top-level namespaces.

The output attributes and schema of the command have been modified to add a catalog column indicating the catalog of the corresponding namespace. The new column is added to the end of the output attributes, as shown below:

Previous output

| Namespace        |
|------------------|
| test-namespace-1 |
| test-namespace-2 |

New output

| Namespace        | Catalog        |
|------------------|----------------|
| test-namespace-1 | test-catalog-1 |
| test-namespace-2 | test-catalog-2 |

Liquid clustering now compacts deletion vectors more efficiently

Delta tables with Liquid clustering now apply physical changes from deletion vectors more efficiently when OPTIMIZE is running.

Allow non-deterministic expressions in UPDATE/INSERT column values for MERGE operations

SAP Databricks now allows the use of non-deterministic expressions in updated and inserted column values of MERGE operations. However, non-deterministic expressions in the conditions of MERGE statements are not supported.

For example, you can now generate dynamic or random values for columns:

MERGE INTO target USING source
ON target.key = source.key
WHEN MATCHED THEN UPDATE SET target.value = source.value + rand()

This can be helpful for data privacy by obfuscating actual data while preserving the data properties (such as mean values or other computed columns).

Support VAR keyword for declaring and dropping SQL variables

SQL syntax for declaring and dropping variables now supports the VAR keyword in addition to VARIABLE. This change unifies the syntax across all variable-related operations, which improves consistency and reduces confusion for users who already use VAR when setting variables.
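For example, both spellings are now accepted:

```sql
DECLARE VAR counter INT DEFAULT 0;   -- previously required DECLARE VARIABLE
DROP TEMPORARY VAR counter;          -- previously required DROP TEMPORARY VARIABLE
```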

June 2025

Databricks SQL Serverless engine upgrades

June 11, 2025

The following engine upgrades are now rolling out globally, with availability expanding to all regions over the coming weeks.

  • Lower latency: Mixed workloads now run faster, with up to 25% improvement. The upgrade is automatically applied to serverless SQL warehouses with no additional cost or configuration.
  • Predictive Query Execution (PQE): PQE monitors tasks in real time and dynamically adjusts query execution to help avoid skew, spills, and unnecessary work.
  • Photon vectorized shuffle: Keeps data in compact columnar format, sorts it within the CPU's high-speed cache, and processes multiple values simultaneously using vectorized instructions. This improves throughput for CPU-bound workloads such as large joins and wide aggregations.

User interface updates

June 5, 2025

  • Query insights: Visiting the Query History page now emits the listHistoryQueries event. Opening a query profile now emits the getHistoryQuery event.