February 2025
These features and Databricks platform improvements were released in February 2025.
Releases are staged. Your Databricks account might not be updated until a week or more after the initial release date.
Databricks ODBC driver 2.9.1
February 28, 2025
The Databricks ODBC Driver version 2.9.1 is now available for download from the ODBC driver download page.
This release includes the following enhancements and new features:
- Added token renewal support for the token passthrough OAuth flow.
- Added a driver-side timeout option for idle statements. You can configure the timeout to prevent an application from hanging in the middle of a query execution when results fetching is interrupted.
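As a sketch, a driver-side timeout would typically be set in the DSN configuration. The key name below is hypothetical; consult the Databricks ODBC Driver Guide installed with the download package for the actual option name and units:

```
[Databricks]
Driver=/opt/databricks/odbc/lib/libdatabricksodbc.so
Host=your-workspace.cloud.databricks.com
HTTPPath=/sql/1.0/warehouses/<warehouse-id>
; Hypothetical key name -- see the driver guide for the real setting
IdleStatementTimeout=300
```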
This release resolves the following issues:
- If the initial catalog is not set, the driver now defaults to using the `hive_metastore` catalog.
- The driver updates the HTTP User-Agent from `thrift/0.9.0` to `thrift/0.17.0`.
This release includes upgrades to several third-party libraries:
- OpenSSL 3.0.15 (previously 3.0.13)
- libcURL 8.10.1 (previously 8.7.1)
- ICU 74.2 (previously 71.1)
- Arrow 15.0.0 (previously 9.0.0) for macOS and Linux (Windows was already on 15.0.0)
- lz4 1.10.0 (previously 1.9.4)
- Expat 2.6.3 (previously 2.6.2)
- Kerberos 1.21.3 (previously 1.21.2)
For complete configuration information, see the Databricks ODBC Driver Guide installed with the driver download package.
Publish pipeline event logs as Unity Catalog tables
February 28, 2025
You can now publish event logs for DLT pipelines as tables in Unity Catalog. See Query the event log.
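For example, publishing can be configured through the pipeline settings JSON. The sketch below assumes the documented `event_log` settings object; the catalog, schema, and table names are placeholders:

```json
{
  "id": "<pipeline-id>",
  "event_log": {
    "catalog": "main",
    "schema": "pipeline_logs",
    "name": "my_pipeline_event_log"
  }
}
```

Once published, the event log can be queried like any other Unity Catalog table.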
Databricks Assistant provides table insights and lineage
February 27, 2025
In Unity Catalog-enabled Databricks workspaces, Databricks Assistant can provide table lineage and insights when you use simple prompts.
See Get table lineage using Databricks Assistant.
Import open Delta Sharing shares directly into Databricks using the Import provider dialog
February 26, 2025
Unity Catalog-enabled Databricks workspaces include an Import provider dialog for importing shares from non-Databricks Delta Sharing servers into Unity Catalog using Open-to-Databricks (O2D) sharing. This feature was released in July 2024 but was not announced at the time. When you use it to import a provider, your users have essentially the same recipient experience as they would with Databricks-to-Databricks sharing: click Create catalog to create a shared catalog, use Unity Catalog access controls to grant access to shared tables, and use standard Unity Catalog syntax to query those shares, with no need to store a credential file or specify it when you query shared data. See Databricks: Read shared data using open sharing connectors.
Delta Sharing from open sharing providers to Databricks using OAuth
February 26, 2025
Databricks now supports OAuth Client Credentials for Open-to-Databricks (O2D) sharing, that is, from non-Databricks Delta Sharing servers to Databricks workspaces that are enabled for Unity Catalog. This makes it easier to import shares from Delta Sharing provider networks like Oracle Delta Share that use OAuth Client Credentials. This support adds to the already considerable advantages of using Unity Catalog as a recipient in the O2D model, which include the Import provider UI for importing shares and the ability to use standard Unity Catalog syntax to query those shares, with no need to store a credential file or specify it when you query shared data. See Databricks: Read shared data using open sharing connectors.
Streaming reads on Unity Catalog views are GA
February 25, 2025
Streaming reads from Unity Catalog views registered against Delta tables are now generally available. See Read a Unity Catalog view as a stream.
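A view registered against a Delta table can now be read as a stream like any other table. A minimal PySpark sketch, assuming a Databricks notebook where `spark` is available; the three-level names and checkpoint path are placeholders:

```python
# Read a Unity Catalog view as a stream (view name is a placeholder)
stream_df = spark.readStream.table("main.sales.orders_view")

# Write the stream to a target table; checkpoint location is a placeholder
(
    stream_df.writeStream
    .option("checkpointLocation", "/Volumes/main/sales/checkpoints/orders_view")
    .toTable("main.sales.orders_snapshot")
)
```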
Event time ordering for initial snapshot stream processing is GA
February 25, 2025
The `withEventTimeOrder` option for processing Structured Streaming queries from Delta table sources is now generally available. See Process initial snapshot without data being dropped.
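For example, a Delta source stream that processes the initial snapshot in event-time order might look like the sketch below. The table name and watermark column are placeholders; `withEventTimeOrder` is the option this note refers to:

```python
# Process the Delta table's initial snapshot in event-time order so that
# late records in the snapshot are not dropped by the watermark
df = (
    spark.readStream
    .format("delta")
    .option("withEventTimeOrder", "true")
    .table("main.iot.events")
    .withWatermark("event_time", "10 seconds")
)
```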
Asynchronous progress tracking is GA
February 25, 2025
Asynchronous progress tracking for Structured Streaming is now generally available. See What is asynchronous progress tracking?.
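As a sketch, asynchronous progress tracking is enabled per query with write-side options (per the Structured Streaming docs, it applies to stateless queries writing to Kafka sinks). The sink options and checkpoint path below are placeholders:

```python
# Enable asynchronous progress tracking so offset commits happen off the
# critical path, reducing per-microbatch latency
(
    df.writeStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")
    .option("topic", "events_out")
    .option("asyncProgressTrackingEnabled", "true")
    .option("asyncProgressTrackingCheckpointIntervalMs", "1000")
    .option("checkpointLocation", "/Volumes/main/default/checkpoints/async_demo")
    .start()
)
```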
New filter panel in Databricks Marketplace
February 25, 2025
Databricks Marketplace now includes an improved filter panel designed to streamline search and browsing. This update enhances discoverability, making it easier to search by product, provider, price, and more. See Browse Databricks Marketplace listings.
OAuth secret lifetime now configurable
February 24, 2025
OAuth secrets for service principals now have a configurable lifetime. Newly created OAuth secrets default to a maximum lifetime of two years, whereas previously, they did not expire. See Authorize unattended access to Databricks resources with a service principal using OAuth.
Preview files in volumes
February 18, 2025
Volumes now display previews for common file formats in Catalog Explorer, including images, text files, JSON, YAML, and CSV. See Preview files in volumes.
Model serving billing records are now logged at five-minute intervals
February 18, 2025
To improve cost observability, billing records are now logged for every five minutes of model serving and provisioned throughput usage. Previously, records were logged at one-hour intervals. See Billable usage system table reference. Note that Foundation Model APIs pay-per-token usage is still aggregated at an hourly interval.
Automatic liquid clustering (Public Preview)
February 18, 2025
You can now enable automatic liquid clustering on Unity Catalog managed tables. Automatic liquid clustering intelligently selects clustering keys to optimize data layout for your queries. See Automatic liquid clustering.
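Automatic liquid clustering is enabled with the `CLUSTER BY AUTO` clause; the table names below are placeholders, and this sketch assumes a notebook where `spark` is available:

```python
# Enable automatic clustering key selection on an existing
# Unity Catalog managed table
spark.sql("ALTER TABLE main.sales.orders CLUSTER BY AUTO")

# Or opt in at creation time
spark.sql("""
  CREATE TABLE main.sales.orders_auto (
    order_id BIGINT,
    order_ts TIMESTAMP
  ) CLUSTER BY AUTO
""")
```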
Databricks Runtime 14.1 series support ends
February 12, 2025
Support for Databricks Runtime 14.1 and Databricks Runtime 14.1 for Machine Learning ended on November 5. See Databricks support lifecycles.
Lakehouse Monitoring is Public Preview
February 11, 2025
Databricks Lakehouse Monitoring integrates data and model quality monitoring into the Databricks platform. It provides built-in quality metrics such as summary statistics, changes in distribution over time, and model performance including fairness and bias. Lakehouse Monitoring automatically generates a summary dashboard to help you visualize data quality over time. You can also set up alerts based on the data.
Databricks Runtime 16.2 is GA
February 10, 2025
Databricks Runtime 16.2 and Databricks Runtime 16.2 ML are now generally available.
See Databricks Runtime 16.2 and Databricks Runtime 16.2 for Machine Learning.
Support for Scala streaming `foreach`, `foreachBatch`, and `flatMapGroupsWithState` on standard access mode compute (formerly shared access mode)
February 7, 2025
Standard access mode compute now supports the Scala streaming function `DataStreamWriter.foreach` on Databricks Runtime 16.1 and above. On Databricks Runtime 16.2 and above, the functions `DataStreamWriter.foreachBatch` and `KeyValueGroupedDataset.flatMapGroupsWithState` are also supported.
Use service credentials for Unity Catalog-governed access to external cloud services
February 7, 2025
Service credentials enable simple and secure authentication with your Google Cloud services from Databricks. Service credentials are generally available and support both Python and Scala SDKs. This feature requires compute on Databricks Runtime 16.2 or above. See Manage access to external cloud services using service credentials.
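As a sketch, the documented pattern is to obtain a credentials provider from `dbutils` and pass it to a Google Cloud client; the credential name, project, and bucket below are placeholders, and the `google-cloud-storage` package is assumed to be installed on the cluster:

```python
from google.cloud import storage

# Unity Catalog-governed credentials for Google Cloud
# (service credential name is a placeholder)
creds = dbutils.credentials.getServiceCredentialsProvider("my-gcp-credential")

# Use the governed credentials with a standard Google Cloud SDK client
client = storage.Client(credentials=creds, project="my-project")
for blob in client.list_blobs("my-bucket", max_results=5):
    print(blob.name)
```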
Download as Excel is now supported in notebooks connected to SQL warehouses
February 6, 2025
For notebooks connected to SQL warehouses, you can now download cell results that contain tabular data as an Excel file. See Download results.
Write data from pipelines to external services with DLT sinks (Public Preview)
February 5, 2025
The DLT `sink` API is in Public Preview. With DLT sinks, you can write data transformed by your pipeline to external targets, such as event streaming services like Apache Kafka or Azure Event Hubs, and to external tables managed by Unity Catalog or the Hive metastore. See Stream records to external services with DLT sinks.
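A minimal sketch of the pattern, assuming the preview `dlt.create_sink` API and an upstream pipeline table; the sink name, broker, topic, and table name are placeholders:

```python
import dlt
from pyspark.sql.functions import to_json, struct

# Define a Kafka sink (broker and topic are placeholders)
dlt.create_sink(
    name="orders_kafka_sink",
    format="kafka",
    options={
        "kafka.bootstrap.servers": "broker:9092",
        "topic": "orders_out",
    },
)

# Route transformed records from the pipeline into the sink
@dlt.append_flow(name="orders_to_kafka", target="orders_kafka_sink")
def orders_to_kafka():
    return (
        spark.readStream.table("orders_cleaned")
        .select(to_json(struct("*")).alias("value"))
    )
```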
Predictive optimization
February 4, 2025
Predictive optimization removes the need to manually manage maintenance operations for Delta tables. Maintenance operations run only when necessary, eliminating both unnecessary maintenance runs and the burden of tracking and troubleshooting performance. See Predictive optimization for Unity Catalog managed tables.
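Predictive optimization can be enabled at the catalog, schema, or table level, with child objects inheriting the setting; the names below are placeholders, and this sketch assumes a notebook where `spark` is available:

```python
# Enable predictive optimization for a schema so that all of its
# managed tables inherit the setting (names are placeholders)
spark.sql("ALTER SCHEMA main.sales ENABLE PREDICTIVE OPTIMIZATION")

# Inspect a table's properties to confirm the effective setting
spark.sql("DESCRIBE TABLE EXTENDED main.sales.orders").show(truncate=False)
```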