
December 2025

These features and Databricks platform improvements were released in December 2025.

note

Releases are staged. Your Databricks account might not be updated until a week or more after the initial release date.

Databricks Assistant Agent Mode is now in Public Preview

December 23, 2025

The Databricks Assistant Agent Mode preview is now enabled by default for most customers.

  • The agent can automate multiple steps. From a single prompt, it can retrieve relevant assets, generate and run code, fix errors automatically, and visualize results. It can also sample data and cell outputs to produce better results.
  • In Agent Mode, the Assistant chooses between Azure OpenAI and Anthropic on Databricks (which uses endpoints hosted by Databricks Inc. in AWS, within the Databricks security perimeter). Agent Mode is available only when the partner-powered AI features setting is enabled.
  • Admins can disable the preview if needed until the feature reaches General Availability.

See Use the Data Science Agent, the blog post, and Partner-powered AI features.

Single-use refresh tokens for OAuth applications

December 22, 2025

You can now configure single-use refresh tokens for OAuth applications integrated with Databricks. This security feature requires token rotation after each use, enhancing protection for user-to-machine authentication flows. See Single-use refresh tokens.
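Conceptually, single-use refresh tokens require the client to persist the new refresh token returned with every token response, because the presented token is invalidated after one use. The sketch below simulates that rotation pattern locally; `AuthServer` and `Client` are illustrative stand-ins, not Databricks or OAuth-library APIs.

```python
# Conceptual sketch of single-use refresh-token rotation.
# AuthServer is a local stand-in for an OAuth token endpoint, NOT a Databricks API.
import secrets

class AuthServer:
    def __init__(self):
        self.valid_refresh_tokens = {secrets.token_hex(8)}

    def first_token(self):
        return next(iter(self.valid_refresh_tokens))

    def refresh(self, refresh_token):
        if refresh_token not in self.valid_refresh_tokens:
            raise PermissionError("refresh token already used or unknown")
        # Single-use: invalidate the presented token and issue a replacement.
        self.valid_refresh_tokens.discard(refresh_token)
        new_refresh = secrets.token_hex(8)
        self.valid_refresh_tokens.add(new_refresh)
        return {"access_token": secrets.token_hex(8), "refresh_token": new_refresh}

class Client:
    """A client that persists the rotated refresh token after every use."""
    def __init__(self, server, refresh_token):
        self.server = server
        self.refresh_token = refresh_token

    def get_access_token(self):
        resp = self.server.refresh(self.refresh_token)
        self.refresh_token = resp["refresh_token"]  # must store the new token
        return resp["access_token"]

server = AuthServer()
client = Client(server, server.first_token())
first = client.get_access_token()   # rotates the refresh token
second = client.get_access_token()  # succeeds because the client stored the new token
```

The key client-side obligation: any integration that caches a refresh token must update that cache on every refresh, or the next refresh will fail.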

Update request parameters for Delta Sharing recipient audit log events

December 19, 2025

For Delta Sharing recipients, deltaSharingProxy* audit log events now also include the catalog_name request parameter, in addition to share_name (previously named share). See Delta Sharing recipient events.

Anthropic Claude Haiku 4.5 now available as a Databricks-hosted model

December 19, 2025

Mosaic AI Model Serving now supports Anthropic Claude Haiku 4.5 as a Databricks-hosted model. You can access this model using Foundation Model APIs pay-per-token.

New Databricks accounts will not have access to legacy features

December 19, 2025

Databricks accounts created after December 18, 2025 do not have access to certain legacy features, such as DBFS root and mounts, the Hive metastore, and no-isolation shared compute. These accounts use Unity Catalog exclusively for unified governance and enterprise-grade security.

This behavior enforces the Disable legacy features account setting available in existing Databricks accounts. See Disable access to legacy features in new workspaces.

MySQL connector in Lakeflow Connect (Public Preview)

December 18, 2025

The fully managed MySQL connector in Lakeflow Connect is in Public Preview. This connector enables incremental data ingestion from MySQL databases, including Amazon RDS for MySQL, Amazon Aurora MySQL, Azure Database for MySQL, Google Cloud SQL for MySQL, and MySQL on EC2. See Configure MySQL for ingestion into Databricks.

Contact your Databricks account team to request access to the preview.

Meta Ads connector (Beta)

December 18, 2025

You can now ingest data from Meta Ads. See Set up Meta Ads as a data source.

Lakebase Autoscaling metrics dashboard

December 18, 2025

Lakebase Autoscaling (Public Preview) now includes a Metrics dashboard for monitoring system and database metrics. See Metrics.

View latest scheduled notebook job results

December 18, 2025

Databricks notebooks can now display the latest scheduled run directly in the notebook and in notebook dashboards. You can also update the notebook with the results of the latest run.

For more details, see View last successful run and update notebook.

Connect to Lakebase Autoscaling from the SQL editor with read-write access

December 18, 2025

Lakebase Autoscaling (Public Preview) now supports direct connections from the SQL editor with full read-write access. See Query from SQL Editor in Lakehouse.

Context-based ingress control is now in Public Preview

December 17, 2025

Context-based ingress control is now in Public Preview. This feature enables account admins to set allow and deny rules that combine who is calling, where they are calling from, and what they can reach in Databricks. Context-based ingress control ensures that only trusted combinations of identity, request type, and network source can reach your workspace. A single policy can govern multiple workspaces, ensuring consistent enforcement across your organization.

See Context-based ingress control.
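As a mental model, each rule pairs an effect (allow or deny) with a combination of identity, request type, and network source. The sketch below is purely conceptual; the rule fields, first-match evaluation, and default-deny behavior are illustrative assumptions, not the Databricks policy syntax or semantics.

```python
# Conceptual model of context-based ingress rules. Field names, first-match
# evaluation, and default-deny are illustrative assumptions only.
from dataclasses import dataclass

@dataclass
class Rule:
    effect: str        # "allow" or "deny"
    principals: set    # who is calling
    request_types: set # e.g. {"ui", "api"}
    networks: set      # symbolic network sources

    def matches(self, principal, request_type, network):
        return (principal in self.principals
                and request_type in self.request_types
                and network in self.networks)

def evaluate(rules, principal, request_type, network):
    """First matching rule wins; deny if nothing matches (assumed default)."""
    for rule in rules:
        if rule.matches(principal, request_type, network):
            return rule.effect
    return "deny"

policy = [
    Rule("allow", {"data-eng"}, {"api", "ui"}, {"corp-vpn"}),
    Rule("deny", {"data-eng"}, {"api"}, {"public-internet"}),
]

print(evaluate(policy, "data-eng", "ui", "corp-vpn"))          # allow
print(evaluate(policy, "data-eng", "api", "public-internet"))  # deny
print(evaluate(policy, "contractor", "ui", "corp-vpn"))        # deny (no match)
```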

Lakebase Autoscaling ACL support

December 17, 2025

Lakebase Autoscaling (Public Preview) now supports Access Control Lists (ACLs). Grant CAN CREATE or CAN MANAGE permissions to control who can access and manage project resources. Manage permissions from project settings in the Lakebase App. See Manage project permissions.

Gemini 3 Flash now available as a Databricks-hosted model

December 17, 2025

Gemini 3 Flash is now available as a Databricks-hosted model. This model offers speed and scale without compromising quality, with advanced multimodal capabilities for complex video analysis, data extraction, and visual Q&A. For more information, see Gemini 3 Flash.

Login required to download ODBC driver

December 17, 2025

You must now log in to Databricks and accept license terms before downloading the Simba Apache Spark ODBC Driver. See Download and install the Databricks ODBC Driver (Simba).

If you use Databricks on AWS GovCloud, contact your account team to receive access to the driver.

Flexible node types are now generally available

December 17, 2025

Flexible node types allow your compute resource to fall back to alternative, compatible instance types when your specified instance type is unavailable. This improves launch reliability by reducing capacity-related failures. See Improve compute launch reliability using flexible node types.

New resource types for Databricks Apps

December 17, 2025

You can now add MLflow experiments, vector search indexes, user-defined functions (UDFs), and Unity Catalog connections as Databricks Apps resources. See Add resources to a Databricks app.

Run read-only queries on Lakebase (Provisioned) readable secondaries from SQL editor

December 15, 2025

You can now connect to Lakebase (Provisioned) readable secondaries and run read-only queries from the Databricks SQL editor. See Execute read-only queries from Databricks SQL Editor and Access a database instance from the SQL editor.

Delta Sharing to external Iceberg clients is in Public Preview

December 15, 2025

You can now share tables, materialized views, and streaming tables to external Iceberg clients such as Snowflake, Trino, Flink, and Spark. External Iceberg clients can query shared Delta tables with zero-copy access. For details, see Enable sharing to external Iceberg clients and Iceberg clients: Read shared Delta tables.

Lakebase (Autoscaling) now in Public Preview

December 12, 2025

Lakebase (Autoscaling) is now in Public Preview on AWS. This new version of Lakebase introduces autoscaling compute, scale-to-zero, database branching, instant restore, and a redesigned project-based interface. To allow users to explore the new version, usage of Lakebase Autoscaling is free for a limited time. Billing for Lakebase Autoscaling usage begins in January 2026. See Get started with Lakebase Postgres (Autoscaling Preview).

Disable legacy features settings are now GA

December 11, 2025

To help migrate accounts and workspaces to Unity Catalog, two admin settings that disable legacy features are now generally available. See Disable access to legacy features in new workspaces.

OpenAI GPT-5.2 now available as a Databricks-hosted model

December 11, 2025

Mosaic AI Model Serving now supports OpenAI GPT-5.2 as a Databricks-hosted model. You can access this model using Foundation Model APIs pay-per-token.

Confluence connector (Beta)

December 16, 2025

The fully managed Confluence connector in Lakeflow Connect enables you to ingest Confluence spaces, pages, attachments, blog posts, labels, and classification levels into Databricks. See Configure OAuth U2M for Confluence ingestion.

PostgreSQL connector in Lakeflow Connect (Public Preview)

December 16, 2025

The fully managed PostgreSQL connector in Lakeflow Connect is in Public Preview. This connector enables incremental data ingestion from PostgreSQL databases, including Amazon RDS PostgreSQL, Amazon Aurora PostgreSQL, Azure Database for PostgreSQL, Google Cloud SQL for PostgreSQL, and on-premises PostgreSQL databases. See Configure PostgreSQL for ingestion into Databricks.

Customizable SharePoint connector (Beta)

December 10, 2025

The standard SharePoint connector offers more flexibility than the managed SharePoint connector. It allows you to ingest structured, semi-structured, and unstructured files into Delta tables with full control over schema inference, parsing options, and transformations. To get started, see Ingest files from SharePoint.

For an in-depth comparison of the SharePoint connectors, see Choose your SharePoint connector.

NetSuite connector (Public Preview)

December 10, 2025

You can now ingest data from the NetSuite2.com data source programmatically using the Databricks API, the Databricks CLI, or a Databricks notebook. See Configure NetSuite for ingestion into Databricks.

Change owner for materialized views or streaming tables defined in Databricks SQL

December 10, 2025

You can now change the owner for materialized views or streaming tables defined in Databricks SQL through Catalog Explorer. For materialized view details, see Configure materialized views in Databricks SQL. For streaming table details, see Use streaming tables in Databricks SQL.

Discover files in Auto Loader efficiently using file events

December 10, 2025

Auto Loader with file events is now generally available. With this feature, Auto Loader can discover files with the efficiency of notifications while retaining the setup simplicity of directory listing. This is the recommended way to use Auto Loader (and file notifications in particular) with Unity Catalog.

To start using Auto Loader with file events, see the Auto Loader documentation.
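As a sketch of what enabling file events looks like, the options below are assembled as a plain dict. The option name `cloudFiles.useManagedFileEvents` is taken from Databricks' file events documentation as of this writing; verify it against the current docs before relying on it.

```python
# Illustrative Auto Loader stream options for file events. The option name
# "cloudFiles.useManagedFileEvents" is an assumption from the Databricks docs.
autoloader_options = {
    "cloudFiles.format": "json",
    "cloudFiles.useManagedFileEvents": "true",  # discover files via file events
}

# In a Databricks notebook (not runnable locally), these options would be
# applied to a streaming read, for example:
#   df = (spark.readStream.format("cloudFiles")
#         .options(**autoloader_options)
#         .load("/Volumes/main/default/landing"))
```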

ForEachBatch for Lakeflow Spark Declarative Pipelines is available (Public Preview)

December 9, 2025

You can now process streams in Lakeflow Spark Declarative Pipelines as a series of micro-batches in Python, using a ForEachBatch sink. The ForEachBatch sink is available in public preview.

See Use ForEachBatch to write to arbitrary data sinks in pipelines.

Databricks Runtime 18.0 and Databricks Runtime 18.0 ML are in Beta

December 9, 2025

Databricks Runtime 18.0 and Databricks Runtime 18.0 ML are now in Beta, powered by Apache Spark 4.0.0. The release includes JDK 21 as the default, new features for jobs and streaming, and library upgrades.

See Databricks Runtime 18.0 (Beta) and Databricks Runtime 18.0 for Machine Learning (Beta).

Databricks Runtime maintenance updates (12/09)

December 9, 2025

New maintenance updates are available for supported Databricks Runtime versions. These updates include bug fixes, security patches, and performance improvements. For details, see Databricks Runtime maintenance updates.

New columns in Lakeflow system tables (Public Preview)

December 9, 2025

New columns are now available in the Lakeflow system tables to provide enhanced job monitoring and troubleshooting capabilities:

  • jobs table: trigger, trigger_type, run_as_user_name, creator_user_name, paused, timeout_seconds, health_rules, deployment, create_time
  • job_tasks table: timeout_seconds, health_rules
  • job_run_timeline table: source_task_run_id, root_task_run_id, compute, termination_type, setup_duration_seconds, queue_duration_seconds, run_duration_seconds, cleanup_duration_seconds, execution_duration_seconds
  • job_task_run_timeline table: compute, termination_type, task_parameters, setup_duration_seconds, cleanup_duration_seconds, execution_duration_seconds
  • pipelines table: create_time

These columns are not populated for rows emitted before early December 2025. See Jobs system table reference.
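A query over the new duration columns can break a run's wall-clock time into queue, setup, execution, and cleanup phases. The query below is a sketch: the duration and termination columns come from this release note, while `job_id`, `run_id`, and `period_start_time` are assumed from the jobs system table reference.

```python
# Example query over the new job_run_timeline columns. Run it with
# spark.sql(query) in a Databricks workspace; here we only build the string.
# job_id, run_id, and period_start_time are assumptions from the system
# tables docs; the *_duration_seconds columns are listed in this release note.
query = """
SELECT
  job_id,
  run_id,
  queue_duration_seconds,
  setup_duration_seconds,
  execution_duration_seconds,
  cleanup_duration_seconds,
  termination_type
FROM system.lakeflow.job_run_timeline
WHERE period_start_time >= current_date() - INTERVAL 7 DAYS
ORDER BY run_duration_seconds DESC
LIMIT 20
"""
```

Sorting by `run_duration_seconds` and comparing the phase columns makes it easy to tell whether a slow run spent its time queued, in cluster setup, or actually executing.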

New token expiration policy for open Delta Sharing

December 8, 2025

All new Delta Sharing open sharing recipient tokens are issued with a maximum expiration of one year from the date of creation. Tokens with an expiration period longer than one year or no expiration date can no longer be created.

Existing open sharing recipient tokens issued before December 8, 2025, with expiration dates after December 8, 2026, or with no expiration date, automatically expire on December 8, 2026. If you currently use recipient tokens with long or unlimited lifetimes, review your integrations and renew tokens as needed to avoid breaking changes after this date.

See Create a recipient object for non-Databricks users using bearer tokens (open sharing).
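The policy above reduces to simple date arithmetic. The sketch below computes a token's effective expiration under the new rules; it is purely illustrative (for example, it models the one-year limit on new tokens as a cap, whereas Databricks rejects creation of tokens that exceed it).

```python
# Effective expiration under the new Delta Sharing token policy, using the
# dates from this release note. Pure date arithmetic; not a Databricks API.
from datetime import date, timedelta

POLICY_DATE = date(2025, 12, 8)    # policy takes effect
FORCED_EXPIRY = date(2026, 12, 8)  # pre-existing long-lived tokens expire here

def effective_expiration(created, requested_expiry=None):
    """requested_expiry=None means the token has no expiration date."""
    if created < POLICY_DATE:
        # Pre-policy token: keeps its date unless it outlives the forced cutoff.
        if requested_expiry is None or requested_expiry > FORCED_EXPIRY:
            return FORCED_EXPIRY
        return requested_expiry
    # New tokens are limited to one year from creation (modeled here as a cap;
    # in practice, creation of longer-lived tokens is rejected).
    cap = created + timedelta(days=365)
    if requested_expiry is None or requested_expiry > cap:
        return cap
    return requested_expiry

print(effective_expiration(date(2024, 3, 1)))                     # forced to 2026-12-08
print(effective_expiration(date(2026, 1, 10), date(2028, 1, 1)))  # capped at one year
```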

Expanded regional availability for C5 and TISAX compliance

December 8, 2025

You can now use the Cloud Computing Compliance Criteria Catalogue (C5) and the Trusted Information Security Assessment Exchange (TISAX) compliance standards in all regions and with serverless compute. See Classic and serverless compute support by region.

Vector Search reranker is now generally available

December 8, 2025

The Vector Search reranker is now generally available. Reranking can help improve retrieval quality. For more information, see Use the reranker in a query.

Built-in Excel file format support (Beta)

December 2, 2025

Databricks now provides built-in support for reading Excel files. You can query Excel files directly using Spark DataFrames without external libraries. See Read Excel files.