
July 2025

These features and Databricks platform improvements were released in July 2025.

note

Releases are staged. Your Databricks account might not be updated until a week or more after the initial release date.

New compute policy form (Public Preview)

July 31, 2025

The new compute policy form uses UI elements to help you configure policy definitions, making it simpler to write compute policies without referencing the policy syntax directly.


The new form includes the following changes:

  • New definition dropdown menus allow you to configure rules without needing to reference the policy syntax.
  • Max compute resources per user, max DBUs per hour, and cluster type settings have moved under the Advanced options section.
  • Tagging definitions now have their own separate section.
  • Policy permission settings have moved out of the policy form and are now set using the permissions modal in the policy overview page.

See Configure policy definitions using the new policy form (Public Preview).

Delta Sharing supports sharing tables and schemas secured by ABAC policies (Beta)

July 31, 2025

Delta Sharing providers can now add tables and schemas secured by attribute-based access control (ABAC) to a Delta share. The policy does not govern the recipient's access, so recipients have full access to the shared asset. Recipients can apply their own ABAC policies.

See Add tables and schemas secured by ABAC policies to a share and Read data assets secured by ABAC policies.

Sharing streaming tables and materialized views is GA

July 30, 2025

Using Delta Sharing to share streaming tables and materialized views is now generally available, with fewer limitations for share providers and recipients.

See Add streaming tables to a share, Add materialized views to a share, and Read shared streaming tables and materialized views.

Jobs & Pipelines list now includes Databricks SQL pipelines

July 29, 2025

The Jobs & Pipelines list now includes pipelines for materialized views and streaming tables that were created with Databricks SQL.

Organization name required to enable Delta Sharing on metastore

July 29, 2025

When you enable Delta Sharing on your metastore, you must specify an organization name if you are sharing data with a Databricks recipient outside your account. Where possible, existing provider names without an organization name are automatically updated to include account details so they are more readable. A readable organization name helps recipients identify their share providers.

See Enable Delta Sharing on a metastore and View providers.

One-time job runs now correctly record the job name in the usage system table

July 28, 2025

The usage_metadata.job_name value in the system.billing.usage table now contains the run names for runs triggered through the one-time run API. If a run name isn't provided in the request body, the job_name field is recorded as Untitled.
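As an illustrative sketch (the endpoint path follows the Jobs one-time run API; the run name and notebook path here are hypothetical), the run_name you supply in the request body is the value that now surfaces as usage_metadata.job_name:

```python
import json

# Hypothetical request body for the one-time run API
# (POST /api/2.1/jobs/runs/submit). The run_name below is what
# system.billing.usage records as usage_metadata.job_name;
# omitting it records the run as "Untitled".
payload = {
    "run_name": "nightly-backfill",  # surfaces as usage_metadata.job_name
    "tasks": [
        {
            "task_key": "backfill",
            "notebook_task": {"notebook_path": "/Workspace/etl/backfill"},
        }
    ],
}

body = json.dumps(payload)
print(body)
```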

Serverless compute runtime updated to 17.0

July 28, 2025

Serverless compute for notebooks and jobs now uses an upgraded runtime, which roughly corresponds to Databricks Runtime 17.0. See Serverless compute release notes.

Disable DBFS root and mounts is in Public Preview

July 28, 2025

You can now disable access to the Databricks Filesystem (DBFS) root and mounts in existing Databricks workspaces. See Disable access to DBFS root and mounts in your existing Databricks workspace.

Configure single sign-on with your identity provider (Public Preview)

July 27, 2025

By default, single sign-on using Google Cloud Identity is enabled in Databricks. You can now choose to bring your own identity provider, such as Okta or Microsoft Entra ID, to configure single sign-on to Databricks. See Configure SSO in Databricks.

Improvements to the notebook editing experience

July 25, 2025

The following improvements have been made to the notebook editing experience:

  • Add a split view to edit notebooks side by side. See Edit notebooks side by side.
  • Pressing Cmd + F (Mac) or Ctrl + F (Windows) in a notebook now opens the native Databricks find-and-replace tool. This allows you to quickly search and replace text throughout your entire notebook, including content outside the current viewport. See Find and replace text.
  • Quickly switch between tab groups based on authoring contexts using the Home, Query editor, and Pipeline icons at the top left of the editor. See Switch between authoring contexts.

New columns available in query history system table

July 24, 2025

New columns are now available in the query history system table, providing additional query insights:

  • cache_origin_statement_id: For query results fetched from cache, this field contains the statement ID of the query that originally inserted the result into the cache.
  • query_parameters: A struct containing named and positional parameters used in parameterized queries.
  • written_rows: The number of rows of persistent data written to cloud object storage.
  • written_files: The number of files of persistent data written to cloud object storage.

See Query history system table reference.
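For example, a query over the new columns might look like the following (a sketch that assumes the query history table lives at system.query.history; run the string with spark.sql in a notebook or paste it into a SQL editor):

```python
# Illustrative query over the new query-history columns. The column names
# come from this release note; the table name system.query.history is assumed.
query = """
SELECT
  statement_id,
  cache_origin_statement_id,  -- statement that originally populated the cache
  query_parameters,           -- struct of named and positional parameters
  written_rows,               -- rows of persistent data written
  written_files               -- files of persistent data written
FROM system.query.history
WHERE written_rows > 0
LIMIT 10
"""

# In a Databricks notebook you could then run:
# display(spark.sql(query))
```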

Moving tables between Lakeflow Declarative Pipelines is now GA

July 24, 2025

Tables created by Lakeflow Declarative Pipelines in Unity Catalog ETL pipelines can be moved from one pipeline to another. See Move tables between Lakeflow Declarative Pipelines.

Dynamic partition overwrite with INSERT REPLACE USING (Public Preview)

July 23, 2025

INSERT REPLACE USING is now in Public Preview for Databricks Runtime 16.3. This SQL command replaces part of the table with the result of a query. Rows are replaced when all columns listed in USING match with the = operator.

For Public Preview, columns in the USING clause must be the full set of the table's partition columns.

See INSERT in the SQL language reference and Selectively overwrite data with Delta Lake.
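As a hedged sketch of the semantics described above (the table, source, and partition names are hypothetical, and the exact grammar is in the INSERT reference), replacing the partitions whose sale_date values appear in the source might look like:

```python
# Illustrative INSERT REPLACE USING statement. Assumes a Delta table `sales`
# partitioned by `sale_date`; rows whose sale_date matches a value produced
# by the query are replaced with the query's result.
stmt = """
INSERT INTO sales
REPLACE USING (sale_date)
SELECT * FROM sales_updates
"""

# On Databricks Runtime 16.3+ you could run:
# spark.sql(stmt)
```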

Selectively replace data with INSERT REPLACE ON (Public Preview)

July 23, 2025

INSERT REPLACE ON is now in Public Preview for Databricks Runtime 17.1. This SQL command replaces part of the table with the result of a query. Rows are replaced when they match the user-defined condition.

See INSERT in the SQL language reference.

Real-time mode in Structured Streaming (Public Preview)

July 22, 2025

You can now use real-time mode, a trigger type for Structured Streaming that enables sub-second latency data processing. This mode is designed for operational workloads that require immediate response to streaming data. See Real-time mode in Structured Streaming.

New region: southamerica-east1 (São Paulo, Brazil)

July 21, 2025

Databricks is now available in the GCP region southamerica-east1, located in São Paulo, Brazil. See Databricks clouds and regions.

PCI DSS compliance controls now available

July 16, 2025

PCI DSS (Payment Card Industry Data Security Standard) compliance controls are now available to help organizations meet PCI DSS requirements when processing and storing cardholder data. See PCI DSS v4.0.

Databricks documentation release notes feed

July 16, 2025

The Databricks documentation site now provides an RSS feed that contains updates to the product and other feature release notes. Any feed reader or client that consumes RSS can use the feed, letting you take advantage of features such as email notifications for Databricks product releases. See Databricks release notes feed.

Explore table data using an LLM (Beta)

July 16, 2025

You can now ask natural language questions about the sample data using Catalog Explorer. The Assistant generates the SQL based on metadata context and table usage patterns. After the query is generated, you can validate the query and then run it against the underlying table. See Explore table data using an LLM.

Databricks Runtime 17.1 and Databricks Runtime 17.1 ML are in Beta

July 15, 2025

Databricks Runtime 17.1 and Databricks Runtime 17.1 ML are now in Beta. These releases include improvements to streaming, SQL functions, and connector behavior, along with reliability and performance enhancements across the platform.

See Databricks Runtime 17.1 and Databricks Runtime 17.1 for Machine Learning.

Google Gemma 3 12B now available on Mosaic AI Model Serving

July 15, 2025

Google Gemma 3 12B is now available on Mosaic AI Model Serving as a Databricks-hosted foundation model. See Supported foundation models on Mosaic AI Model Serving.

The Gemma 3 12B model supports text inputs for the following Model Serving features:

  • Foundation Model APIs pay-per-token.
  • Foundation Model APIs provisioned throughput.
  • AI Functions. Both real-time inference and batch inference workloads are supported.

The CAN VIEW permission on SQL warehouses is now generally available

July 15, 2025

The CAN VIEW permission allows users to view SQL warehouses, including query history and query profiles. These users cannot run queries on the warehouse.

See SQL warehouse ACLs.

Simplified compute form enabled by default

July 15, 2025

The simplified compute creation form is now enabled by default when you create all-purpose or jobs compute in the Databricks UI.

See Use the simple form to manage compute.

Serverless notebooks: Restore Python variables after idle termination

July 14, 2025

Databricks now snapshots your notebook’s Python variables before terminating idle serverless compute. When you reconnect, your notebook is automatically restored from its snapshot, letting you continue your work seamlessly.

See Automated session restoration for serverless notebooks.

Must be metastore admin to transfer share ownership for Delta Sharing

July 14, 2025

To change the ownership of a share for Delta Sharing, you must now be the metastore admin. Share owners can no longer transfer ownership. See Update shares.

Git support for alerts

July 11, 2025

You can now use Databricks Git folders to track and manage changes to alerts. To track alerts with Git, place them in a Databricks Git folder. Newly cloned alerts appear in the alerts list page or API only after a user interacts with them. Cloned alerts have paused schedules and must be explicitly resumed by users.

See How Git integration works with alerts.

Databricks connector for Power BI now supports the ADBC driver (Public Preview)

July 11, 2025

You can set the Databricks connector for Power BI to use the Arrow Database Connectivity (ADBC) driver instead of the ODBC driver.

See Arrow Database Connectivity (ADBC) driver for Power BI.

Serverless compute is now available in northamerica-northeast1, europe-west1, and asia-northeast1

July 11, 2025

Serverless compute for notebooks, workflows, and Lakeflow Declarative Pipelines is now available in the northamerica-northeast1, europe-west1, and asia-northeast1 regions. See Connect to serverless compute.

MLflow support for compliance security profile standards

July 8, 2025

MLflow now supports all compliance security profile standards supported by Databricks, such as HIPAA. For a full list of supported compliance standards, see Compliance security profile.

Parent tasks (Run job and For each) now have a separate limit

July 4, 2025

Tasks that wait on child processes (Run job and For each tasks) now have a separate limit for the number of tasks that can run simultaneously, and do not count against the overall limit.

See Resource limits.

Git folders now support multiple Git credentials per user (Public Preview)

July 2, 2025

You can now use the UI to add and manage multiple Git credentials in the workspace, from one or more Git providers.

See Set up Databricks Git folders.