November 2025
These features and Databricks platform improvements were released in November 2025.
Releases are staged. Your Databricks account might not be updated until a week or more after the initial release date.
Real-time collaboration in notebook cells, files, and SQL editor
November 21, 2025
Real-time collaboration is now available in notebook cells, files, and the new SQL editor. Multiple users can edit the same cell simultaneously and can view each other's edits. For more information, see Collaborate using Databricks notebooks.
Use Git CLI commands in Git folders (Beta)
November 21, 2025
You can now run standard Git commands directly from the Databricks web terminal in Git folders. See Use Git CLI commands (Beta).
Configure compute size for Databricks Apps (Public Preview)
November 20, 2025
You can now configure the compute size for your Databricks apps to control CPU, memory, and cost based on your workload requirements. For more information, see Configure the compute size for a Databricks app.
Automatic email notifications for expiring personal access tokens (Public Preview)
November 20, 2025
Databricks now sends automatic email notifications to workspace users approximately 7 days before their personal access tokens expire. For more information, see Set the maximum lifetime of new personal access tokens.
Google Gemini 3 Pro Preview now available as a Databricks-hosted model
November 19, 2025
Mosaic AI Model Serving now supports Google Gemini 3 Pro Preview as a Databricks-hosted model.
To access this model, query its Databricks-hosted endpoint in Mosaic AI Model Serving.
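The following is a minimal Python sketch of querying a Databricks-hosted chat model through the OpenAI-compatible Model Serving interface. The endpoint name databricks-gemini-3-pro-preview, the workspace URL, and the token handling are illustrative assumptions; confirm the actual endpoint name in the Serving UI.

```python
# Hedged sketch: query a Databricks-hosted chat model through the
# OpenAI-compatible serving-endpoints interface.
# The endpoint name, workspace URL, and token variable are placeholders.
import os
from openai import OpenAI

client = OpenAI(
    api_key=os.environ["DATABRICKS_TOKEN"],
    base_url="https://<workspace-url>/serving-endpoints",
)

response = client.chat.completions.create(
    model="databricks-gemini-3-pro-preview",  # placeholder endpoint name
    messages=[{"role": "user", "content": "Summarize the key changes in this release."}],
)
print(response.choices[0].message.content)
```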
Databricks Runtime maintenance updates (11/18)
November 18, 2025
New maintenance updates are available for supported Databricks Runtime versions. These updates include bug fixes, security patches, and performance improvements. For details, see Databricks Runtime maintenance updates.
New region: me-central2 (Dammam, Saudi Arabia)
November 18, 2025
Databricks is now available in the GCP region me-central2, Dammam, Saudi Arabia. See Databricks clouds and regions.
Jobs can now be triggered on source view update
November 17, 2025
Table update triggers for jobs now support views as a source when the view references supported table types. See Trigger jobs when source tables are updated.
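As a hedged illustration of the trigger configuration, the sketch below creates a job whose table update trigger watches a view. The view name, notebook path, and workspace URL are placeholders, and the payload fields follow the Jobs API table_update trigger; verify them against the linked documentation.

```python
# Hedged sketch: create a job that is triggered when a source view is updated.
# The view name, notebook path, and workspace URL are placeholders.
import os
import requests

payload = {
    "name": "refresh-on-view-update",
    "trigger": {
        "pause_status": "UNPAUSED",
        "table_update": {
            "table_names": ["main.analytics.orders_enriched_view"],  # source view
        },
    },
    "tasks": [
        {
            "task_key": "refresh",
            "notebook_task": {"notebook_path": "/Workspace/Users/me@example.com/refresh_downstream"},
        }
    ],
}

resp = requests.post(
    "https://<workspace-url>/api/2.1/jobs/create",
    headers={"Authorization": f"Bearer {os.environ['DATABRICKS_TOKEN']}"},
    json=payload,
)
resp.raise_for_status()
print("Created job", resp.json()["job_id"])
```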
Delta Sharing views automatically permitted in serverless network policies
November 17, 2025
Delta Sharing views are now permitted when using serverless network policies, regardless of storage location alignment between shared tables and view dependencies. This removes the previous limitation and simplifies network policy configuration for Delta Sharing workloads.
See Manage network policies for serverless egress control.
Sharing ABAC-secured assets using Delta Sharing is in Public Preview
November 17, 2025
Applying attribute-based access control (ABAC) policies to tables and schemas shared through Delta Sharing is now in Public Preview. However, recipients can't apply ABAC policies on shared streaming tables.
See Add tables and schemas secured by ABAC policies to a share and Read data assets secured by ABAC policies.
Databricks SQL alerts are now in Public Preview
November 14, 2025
The latest version of Databricks SQL alerts, with a new editing experience, is now in Public Preview. See Databricks SQL alerts.
Improved full refresh flow for SQL Server ingestion pipelines
November 14, 2025
Full refresh operations for SQL Server ingestion pipelines now reduce downtime by delaying the table refresh until after the snapshot is completed. During the snapshot process, destination streaming tables remain available with their existing data. New inserts, deletes, and updates are accumulated but not immediately applied. After the snapshot completes, the full refresh and other accumulated changes are applied in a single update, minimizing disruption to data availability. This improvement helps reduce PENDING_RESET and timeout errors during full refresh operations. See Full refresh behavior.
OpenAI GPT-5.1 now available as a Databricks-hosted model
November 14, 2025
Mosaic AI Model Serving now supports OpenAI GPT-5.1 as a Databricks-hosted model. You can access this model through its Databricks-hosted Model Serving endpoint, for example with the ai_query SQL function as sketched below.
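The following sketch calls the model with the ai_query SQL function from a Databricks notebook, where spark is predefined. The endpoint name databricks-gpt-5-1 and the table and column names are assumptions; check Model Serving for the exact endpoint name.

```python
# Hedged sketch: call a Databricks-hosted model with ai_query from a notebook.
# The endpoint, table, and column names are placeholders.
df = spark.sql(
    """
    SELECT ai_query(
        'databricks-gpt-5-1',
        'Classify this support ticket as billing, technical, or other: ' || ticket_text
    ) AS label
    FROM main.support.tickets
    LIMIT 10
    """
)
df.show(truncate=False)
```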
Connect Genie spaces to Microsoft Copilot Studio (Public Preview)
November 14, 2025
Connecting Databricks Genie spaces as tools in Microsoft Copilot Studio agents is in Public Preview. To use this feature, enable the Managed MCP Servers preview in your workspace and add your Genie space as a tool when configuring a Copilot Studio agent. See Connect to Databricks from Microsoft Power Platform.
SFTP connector (Public Preview)
November 13, 2025
The SFTP connector in Lakeflow Connect (Public Preview) extends Auto Loader functionality to ingest files from SFTP servers. See Ingest files from SFTP servers.
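As a rough sketch of the Auto Loader pattern this connector extends, the example below streams files into a table. The sftp:// path, the schema and checkpoint locations, and the target table are assumptions; the connector-specific connection setup and option names are described in the linked documentation.

```python
# Hedged sketch: Auto Loader ingestion in the style the SFTP connector extends.
# The sftp:// location and all names are placeholders; connector-specific
# options come from the linked docs.
(
    spark.readStream.format("cloudFiles")
    .option("cloudFiles.format", "csv")
    .option("cloudFiles.schemaLocation", "/Volumes/main/ingest/meta/sftp_orders_schema")
    .load("sftp://sftp.example.com/exports/orders/")  # placeholder SFTP location
    .writeStream
    .option("checkpointLocation", "/Volumes/main/ingest/meta/sftp_orders_checkpoint")
    .trigger(availableNow=True)
    .toTable("main.ingest.sftp_orders")
)
```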
Convert foreign tables to Unity Catalog managed or external tables (Public Preview)
November 13, 2025
You can now convert foreign tables from external catalogs (such as Hive Metastore or AWS Glue) to Unity Catalog managed or external tables while retaining table history and configurations. Use ALTER TABLE SET MANAGED to convert to managed tables with MOVE (disable source access) or COPY (keep source accessible) options, or use ALTER TABLE SET EXTERNAL to convert to external tables with an optional DRY RUN validation. Additionally, you can convert entire foreign catalogs to standard Unity Catalog catalogs using ALTER CATALOG DROP CONNECTION. See Convert a foreign table to a managed Unity Catalog table and Convert a foreign table to an external Unity Catalog table.
Minimum Databricks Runtime versions:
- ALTER TABLE SET MANAGED: DBR 17.3 LTS or above
- ALTER TABLE SET EXTERNAL: DBR 17.0 LTS or above
- ALTER CATALOG DROP CONNECTION: DBR 17.3 LTS or above
- ALTER SCHEMA SET MANAGED LOCATION: DBR 16.4 LTS or above
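The statements below are a hedged sketch of the conversion commands named above, run from a notebook. All identifiers are placeholders, and the placement of the MOVE, COPY, and DRY RUN options should be confirmed against the linked pages.

```python
# Hedged sketch of the conversion statements described above.
# Catalog, schema, and table names are placeholders; the MOVE/COPY and DRY RUN
# option syntax is covered in the linked docs.

# Validate a conversion to an external table without making changes.
spark.sql("ALTER TABLE glue_cat.sales.orders SET EXTERNAL DRY RUN")

# Convert a foreign table to a Unity Catalog external table.
spark.sql("ALTER TABLE glue_cat.sales.orders SET EXTERNAL")

# Convert a foreign table to a Unity Catalog managed table
# (add the MOVE or COPY option as described in the linked docs).
spark.sql("ALTER TABLE glue_cat.sales.customers SET MANAGED")

# Convert an entire foreign catalog to a standard Unity Catalog catalog.
spark.sql("ALTER CATALOG glue_cat DROP CONNECTION")
```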
JDBC Unity Catalog connection in Beta
November 13, 2025
Databricks now supports connecting to external databases over JDBC with Unity Catalog connections in Beta. You can use a JDBC Unity Catalog connection to read from and write to a data source with the Spark Data Source API or the Databricks Remote Query SQL API. Key benefits include:
- Governed access to data sources through Unity Catalog connections
- A single connection that you create once and reuse across any Unity Catalog compute type: serverless, standard clusters, dedicated clusters, and Databricks SQL
- Stability across Spark and compute upgrades
- Connection credentials that are hidden from the querying user
JDBC Unity Catalog connections are available on Databricks Runtime 17.3 and above with standard or dedicated access mode, or on serverless compute. For more information, see JDBC Unity Catalog connection.
Lakeflow Declarative Pipelines has been renamed to Lakeflow Spark Declarative Pipelines
November 12, 2025
Lakeflow Declarative Pipelines has been renamed to Lakeflow Spark Declarative Pipelines. See Lakeflow Spark Declarative Pipelines.
Migrate Community Edition workspaces to Free Edition (Public Preview)
November 7, 2025
The migration tool for migrating Community Edition workspaces to Free Edition is now available. See Migrate Community Edition workspaces to Free Edition.
SAP BDC connector for Databricks supports cross-cloud sharing
November 6, 2025
You can now share data between different cloud environments using the SAP Business Data Cloud (BDC) Connector for Databricks.
See Share data between SAP Business Data Cloud (BDC) and Databricks.
List MCP servers in Databricks Marketplace (Public Preview)
November 6, 2025
Databricks Marketplace now supports listing Model Context Protocol (MCP) servers. Providers can list MCP servers to distribute AI tools, and other Databricks users can install them from Marketplace to connect AI agents to external data sources, SaaS tools, and developer APIs.
See List your data product in Databricks Marketplace and Get access to external MCP servers.
New MCP server tab
November 6, 2025
The new MCP Servers tab in the workspace shows your managed and external MCP servers, and a curated list of MCP servers available to install from Databricks Marketplace.
See Model Context Protocol (MCP) on Databricks.
Updated tag search syntax in workspace search
November 5, 2025
The workspace search syntax for searching tables and views by tag has been updated. To search by both tag key and value, use the syntax tag:<tag_key>:<tag_value>, for example tag:cost_center:finance. Previously, you omitted the tag: prefix when searching by both key and value. To search by tag key alone, continue using tag:<tag_key>. See Use tags to search for tables and views.
Serverless notebook tasks can now use jobs environment
November 5, 2025
The ability to use a job environment in serverless notebook tasks is now generally available, allowing you to use the notebook's environment or select a specific job environment. Previously, serverless notebook tasks always inherited the serverless environment from the notebook. See Configure environment for job tasks.
Databricks Runtime maintenance updates (11/04)
November 4, 2025
New maintenance updates are available for supported Databricks Runtime versions. These updates include bug fixes, security patches, and performance improvements. For details, see Databricks Runtime maintenance updates.
GCP Private Service Connect generally available
November 3, 2025
Google Cloud Private Service Connect (PSC) for Databricks workspaces is now generally available. You can enable PSC without requesting account enablement from your Databricks account team. See Enable Private Service Connect for your workspace.
Attribute-based access control (ABAC) now in Public Preview
November 3, 2025
Attribute-based access control (ABAC) is now in Public Preview. ABAC is a data governance model that provides flexible, scalable, and centralized access control by defining policies based on governed tags applied to data assets.
In Public Preview, ABAC includes two key changes from the Beta:
- Account-level enforcement: ABAC has transitioned from a workspace-level preview to an account-level preview. ABAC policies are now enforced across all workspaces attached to an account's metastore by default. See ABAC Beta to Public Preview transition.
- Automatic type casting for column masks: Databricks now automatically casts the output of column mask functions defined in ABAC policies to match the target column's data type. This ensures consistent typing and more reliable query behavior. Existing column mask functions might fail if their return types are incompatible with the target column's type. See ABAC column mask type casting and the sketch after this entry.
For more information, see Unity Catalog attribute-based access control (ABAC).
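To make the column mask change concrete, the hedged sketch below defines a mask function whose return type differs from the masked column's type. Under the Public Preview behavior, the mask output is cast back to the column's type, so a function like this might start failing. The function, group, and table names are illustrative, and attaching the mask through an ABAC policy follows the linked documentation.

```python
# Hedged illustration of the column mask type-casting change.
# Names are placeholders; the ABAC policy that attaches this mask is
# configured as described in the linked docs.
spark.sql(
    """
    CREATE OR REPLACE FUNCTION main.governance.mask_salary(salary INT)
    RETURNS STRING
    RETURN CASE
        WHEN is_account_group_member('hr_admins') THEN CAST(salary AS STRING)
        ELSE 'REDACTED'
    END
    """
)
# With automatic type casting, the mask output is cast to the masked column's
# type (INT here). 'REDACTED' cannot be cast to INT, so this mask might now
# fail; returning a value of the column's type (for example NULL) avoids that.
```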
Audit logs for request-for-access events are now emitted
November 3, 2025
Actions related to access request destinations (Public Preview) are now emitted in the audit logs. For more details, see Request for access events.
Disable DBFS root and mounts workspace setting is GA
November 3, 2025
The workspace setting that allows workspace admins to disable access to the Databricks Filesystem (DBFS) root and mounts in existing Databricks workspaces is now generally available. See Disable access to DBFS root and mounts in your existing Databricks workspace.
TISAX compliance controls
November 3, 2025
TISAX compliance controls provide enhancements that help you with compliance for your workspace. TISAX (Trusted Information Security Assessment Exchange) is an assessment and exchange mechanism defined by the ENX Association for the German automotive industry, based on ISO/IEC 27001 and VDA ISA requirements. See Trusted Information Security Assessment Exchange (TISAX).
JAR tasks are now supported on serverless compute
November 3, 2025
You can now run JAR jobs on serverless compute. See JAR task for jobs.
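In workspaces where serverless jobs are enabled, tasks that don't reference a cluster run on serverless compute, so a hedged sketch of a serverless JAR task is simply a job with a spark_jar_task and no compute specification. The JAR path, main class, and workspace URL are placeholders, and any serverless environment requirements are described in the linked page.

```python
# Hedged sketch: create a job whose JAR task runs on serverless compute
# (no cluster is specified). The JAR path and class name are placeholders.
import os
import requests

payload = {
    "name": "serverless-jar-example",
    "tasks": [
        {
            "task_key": "run_jar",
            "spark_jar_task": {"main_class_name": "com.example.etl.Main"},
            "libraries": [{"jar": "/Volumes/main/default/libs/etl-assembly.jar"}],
        }
    ],
}

resp = requests.post(
    "https://<workspace-url>/api/2.1/jobs/create",
    headers={"Authorization": f"Bearer {os.environ['DATABRICKS_TOKEN']}"},
    json=payload,
)
resp.raise_for_status()
print(resp.json())
```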