September 2025
These features and Databricks platform improvements were released in September 2025.
Releases are staged. Your Databricks account might not be updated until a week or more after the initial release date.
Databricks Runtime maintenance updates
September 24, 2025
New maintenance updates are available for supported Databricks Runtime versions. These updates include bug fixes, security patches, and performance improvements. For details, see Databricks Runtime maintenance updates.
Route-optimized endpoints now require route-optimized URL path for querying
September 22, 2025
All newly created route-optimized endpoints must be queried using the route-optimized URL. Queries using the standard workspace URL path are not supported for route-optimized endpoints created after September 22, 2025.
See Query route-optimized serving endpoints.
Explore table data using an LLM (Public Preview)
September 22, 2025
You can now ask natural language questions about sample data using Catalog Explorer. The Assistant uses metadata context and table usage patterns to generate a SQL query. You can then validate the query and run it against the underlying table. See Explore table data using an LLM.
The SAP Business Data Cloud (BDC) Connector for Databricks is generally available
September 19, 2025
The SAP BDC Connector enables secure, zero-copy data sharing between SAP BDC and a Unity Catalog-enabled Databricks workspace. Access and analyze SAP BDC data on Databricks, and share Databricks data assets back to SAP BDC for unified analytics across both platforms.
See Share data between SAP Business Data Cloud (BDC) and Databricks.
Databricks One Public Preview
September 17, 2025
Databricks One, a simplified user interface designed for business users, is now in Public Preview. Databricks One provides a single, intuitive entry point to interact with data and AI in Databricks, without requiring technical knowledge of compute resources, queries, models, or notebooks.
With Databricks One, business users can:
- View and interact with AI/BI dashboards to track KPIs and analyze metrics.
- Ask data questions in natural language using AI/BI Genie.
- Use custom-built Databricks Apps that combine analytics, AI, and workflows.
Workspace admins can enable Databricks One from the Previews page in the admin console.
Databricks Runtime 17.2 is now GA
September 16, 2025
Databricks Runtime 17.2 is now generally available. See Databricks Runtime 17.2.
Delta Sharing on Lakehouse Federation is in Beta
September 16, 2025
You can now use Delta Sharing to share foreign schemas and tables created with query federation in Databricks-to-Databricks sharing and open sharing. See Add foreign schemas or tables to a share and Read data in a shared foreign table or foreign schema.
Mount Delta shares to an existing shared catalog
September 12, 2025
Delta Sharing recipients can now mount shares received from their Delta Sharing provider to an existing shared catalog. Previously, recipients needed to create a new catalog for each new share. See Create a catalog from a share.
Serverless jobs and pipelines configured in the UI are now performance optimized by default
September 11, 2025
Serverless jobs and pipelines configured in the UI are now set to Performance optimized by default. This aligns the UI with existing defaults across APIs, Terraform, and SDKs. For more information, see Performance mode in serverless jobs and Performance mode in serverless pipelines.
Google Analytics Raw Data connector GA
September 10, 2025
The Google Analytics Raw Data connector in Lakeflow Connect is now generally available. See Set up Google Analytics 4 and Google BigQuery for Databricks ingestion.
Python custom data sources can be used with Lakeflow Declarative Pipelines
September 10, 2025
You can use Python custom data sources and sinks in your pipeline definitions in Lakeflow Declarative Pipelines.
For information about Python custom data sources, see the following:
- Load data from a Python custom data source.
- Create a Lakeflow Declarative Pipelines sink.
- PySpark custom data sources.
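As a rough illustration of the pattern (not the actual PySpark or Lakeflow API): a custom data source pairs a named source definition with a reader that yields rows, which a pipeline table definition can then consume. A minimal stdlib-only sketch, where every class and function name is illustrative; the real interfaces live in pyspark.sql.datasource and the pipeline Python module:

```python
# Illustrative sketch of the custom-data-source pattern (stdlib only).
# The real API uses pyspark.sql.datasource.DataSource/DataSourceReader and
# Lakeflow Declarative Pipelines decorators; all names here are made up.

_registry = {}  # format name -> data source class

def register(source_cls):
    """Register a custom data source under its format name."""
    _registry[source_cls.name()] = source_cls
    return source_cls

@register
class CounterSource:
    """A toy batch source that emits n sequential rows."""

    @classmethod
    def name(cls):
        return "counter"

    def __init__(self, options):
        self.n = int(options.get("n", 3))

    def read(self):
        # A real DataSourceReader.read() yields rows per partition.
        for i in range(self.n):
            yield {"id": i, "value": i * i}

def read_format(fmt, **options):
    """Stand-in for spark.read.format(fmt).options(...).load()."""
    return list(_registry[fmt](options).read())

def bronze_counts():
    # Stand-in for a pipeline table definition that materializes
    # the custom source into a target dataset.
    return read_format("counter", n=4)

print(bronze_counts())
```

The shape mirrors how a registered source format is referenced by name from inside a pipeline dataset definition.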
Lakeflow Declarative Pipelines now supports stream progress metrics in Public Preview
September 10, 2025
Lakeflow Declarative Pipelines now supports querying the event log for metrics about the progress of a stream. See Monitor pipeline streaming metrics.
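To sketch what consuming these metrics can look like: in Databricks you would query the pipeline event log with SQL (for example, filtering rows where the event type indicates flow progress) and read metric fields out of the event details. The event and field names below follow that general shape but are illustrative, not the exact event log schema:

```python
import json

# Illustrative sketch of pulling streaming progress metrics out of a
# pipeline event log row; field names are assumptions, not the real schema.
sample_event = {
    "event_type": "flow_progress",
    "details": json.dumps({
        "flow_progress": {
            "status": "RUNNING",
            "metrics": {"num_output_rows": 1250, "backlog_bytes": 0},
        }
    }),
}

def stream_metrics(event):
    """Return the metrics dict for a flow-progress event, else None."""
    if event.get("event_type") != "flow_progress":
        return None
    details = json.loads(event["details"])
    return details.get("flow_progress", {}).get("metrics")

print(stream_metrics(sample_event))
```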
Databricks Runtime maintenance update
September 8, 2025
New maintenance updates are available for supported Databricks Runtime versions. These updates include bug fixes, security patches, and performance improvements. For details, see Databricks Runtime maintenance updates.
Databricks Apps support for Genie resources
September 8, 2025
Databricks Apps now supports adding an AI/BI Genie space as an app resource to enable natural language querying over curated datasets. See Add a Genie space resource to a Databricks app.
Databricks connector in Microsoft Power Platform is in Public Preview
September 5, 2025
Use Databricks data to build canvas apps in Power Apps, flows in Power Automate, and agents in Copilot Studio by creating a Databricks connection in Power Platform.
See Connect to Databricks from Microsoft Power Platform.
MLflow metadata is now available in system tables (Public Preview)
September 5, 2025
MLflow metadata is now available in system tables. View metadata managed by the MLflow tracking service across the entire workspace in one central location, and take advantage of lakehouse tooling such as custom AI/BI dashboards, SQL alerts, and large-scale analytics queries.
For information, see MLflow system tables reference.
Databricks Assistant Agent Mode: Data Science Agent is in Beta
September 3, 2025
Agent Mode for Databricks Assistant is now in Beta. In Agent Mode, the Assistant can orchestrate multi-step workflows from a single prompt.
The Data Science Agent is custom-built for data science workflows and can build an entire notebook for tasks like EDA, forecasting, and machine learning from scratch. Using your prompt, it can plan a solution, retrieve relevant assets, run code, use cell outputs to improve results, fix errors automatically, and more.
See Use the Data Science Agent.
C5 compliance controls
September 2, 2025
C5 compliance controls provide enhancements that help you with compliance for your workspace. C5 is a German Federal Office for Information Security (BSI) standard that defines minimum security requirements for cloud service providers. See Cloud Computing Compliance Criteria Catalog (C5).
Tables backed by default storage can be Delta shared to any recipient (Beta)
September 2, 2025
Delta Sharing providers can now share tables backed by default storage with any recipient, including both open and Databricks recipients, even if the recipient is using classic compute. Tables with partitioning enabled are an exception.
See Limitations.
Migration of Lakeflow Declarative Pipelines from legacy publishing mode is rolled back to Public Preview
September 2, 2025
Lakeflow Declarative Pipelines includes a legacy publishing mode that limited publishing to a single catalog and schema. The default publishing mode enables publishing to multiple catalogs and schemas. A migration feature that helps you move from the legacy publishing mode to the default publishing mode was recently released as generally available, but due to an issue found after release, it has been rolled back to Public Preview status and functionality.
See Enable the default publishing mode in a pipeline.
AI agents: Authorize on-behalf-of-user Public Preview
September 2, 2025
AI agents deployed to Model Serving endpoints can use on-behalf-of-user authorization. This lets an agent act as the Databricks user who runs the query, providing added security and fine-grained access control for sensitive data.
SQL Server connector supports SCD type 2
September 1, 2025
The Microsoft SQL Server connector in Lakeflow Connect now supports SCD type 2. This setting, known as history tracking or slowly changing dimensions (SCD), determines how to handle changes in your data over time. With history tracking off (SCD type 1), outdated records are overwritten as they're updated and deleted in the source. With history tracking on (SCD type 2), the connector maintains a history of those changes. See Enable history tracking (SCD type 2).
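To make the contrast concrete, here is a minimal stdlib-only sketch of the two behaviors. The row layout, key names, and the `start`/`end` validity fields are illustrative only; the connector manages the actual target schema for you:

```python
# Illustrative sketch: SCD type 1 overwrites, SCD type 2 keeps history.
# All column names and the validity fields here are assumptions.

def apply_scd1(target, change):
    """SCD type 1: the latest source value replaces the old row."""
    by_id = {r["id"]: r for r in target}
    by_id[change["id"]] = {"id": change["id"], "city": change["city"]}
    return list(by_id.values())

def apply_scd2(target, change, at):
    """SCD type 2: close out the active row and append a new version."""
    rows = []
    for r in target:
        if r["id"] == change["id"] and r["end"] is None:
            r = {**r, "end": at}  # close the currently active version
        rows.append(r)
    rows.append({"id": change["id"], "city": change["city"],
                 "start": at, "end": None})
    return rows

history = [{"id": 1, "city": "Austin", "start": "2025-01-01", "end": None}]
print(apply_scd1([{"id": 1, "city": "Austin"}], {"id": 1, "city": "Boston"}))
print(apply_scd2(history, {"id": 1, "city": "Boston"}, at="2025-09-01"))
```

With history tracking off, only the Boston row survives; with it on, the Austin row is retained with a closed validity window alongside the new Boston row.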