June 2025
These features and Databricks platform improvements were released in June 2025.
Releases are staged. Your Databricks account might not be updated until a week or more after the initial release date.
Configure a pool's availability zone using the Databricks UI
June 19, 2025
You can now set the availability zone of an instance pool in the Databricks UI. Previously, this option was only available through the API. See Configure the availability zone.
AI documentation for Unity Catalog tables and columns now uses the same model as Databricks Assistant
June 18, 2025
AI documentation for Unity Catalog tables and table columns now uses a Databricks-hosted model:
- If you already use a Databricks-hosted model for Assistant, there is no change.
- If you turn off Partner-powered AI assistive features, AI documentation now stays available and uses a Databricks-hosted model.
For more information, see:
- Features governed by the Partner-powered AI assistive features setting
- Add AI-generated comments to Unity Catalog objects
Jobs & Pipelines in the left navigation menu
June 18, 2025
The Jobs & Pipelines item in the left navigation is the entry point to Lakeflow, Databricks' unified set of data engineering features. The Pipelines and Workflows items have been removed from the left navigation, and their functionality is now available from Jobs & Pipelines.
Moving streaming tables and materialized views between pipelines is in Public Preview
June 17, 2025
Tables created by Lakeflow Declarative Pipelines in Unity Catalog ETL pipelines can be moved from one pipeline to another. See Move Lakeflow Declarative Pipelines tables between pipelines.
Attribute-based access control (ABAC) in Unity Catalog is in Beta
June 12, 2025
Databricks now supports attribute-based access control (ABAC) in Unity Catalog, enabling dynamic, tag-driven access policies across catalogs, schemas, and tables. ABAC uses tags and user-defined functions (UDFs) to enforce fine-grained access controls based on data attributes such as sensitivity, region, or business domain.
Using ABAC, you can define scalable policies once and apply them across large sets of data assets. Policies inherit across the object hierarchy and can include row-level filters or column masking logic. This simplifies governance, supports centralized policy management, and improves security posture. See Unity Catalog attribute-based access control (ABAC).
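Today, a masking UDF is bound to one column at a time; ABAC extends this model by letting tag-driven policies apply such functions across many assets at once. A minimal sketch of the UDF-based masking mechanism ABAC builds on (the catalog, schema, function, and group names are hypothetical):

```sql
-- Hypothetical masking function: reveal emails only to members of the 'hr' group
CREATE OR REPLACE FUNCTION main.governance.mask_email(email STRING)
RETURN CASE
  WHEN is_account_group_member('hr') THEN email
  ELSE '***REDACTED***'
END;

-- Without ABAC, a mask is attached column by column; an ABAC policy could
-- instead apply the same function to every column tagged as sensitive
ALTER TABLE main.sales.customers
  ALTER COLUMN email SET MASK main.governance.mask_email;
```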
Automatic liquid clustering is now GA
June 12, 2025
Automatic liquid clustering is now generally available. You can enable automatic liquid clustering on Unity Catalog managed tables. Automatic liquid clustering intelligently selects clustering keys to optimize data layout for your queries. See Automatic liquid clustering.
Monitor and revoke personal access tokens in your account (GA)
June 11, 2025
The token report page enables account admins to monitor and revoke personal access tokens (PATs) in the account console. Databricks recommends you use OAuth access tokens instead of PATs for greater security and convenience. See Monitor and revoke personal access tokens in the account.
Microsoft SharePoint connector (Beta)
June 11, 2025
The fully-managed Microsoft SharePoint connector in Lakeflow Connect allows you to ingest data from SharePoint into Databricks. See Configure OAuth U2M for Microsoft SharePoint ingestion.
The Beta release supports API-based pipeline creation. UI-based pipeline creation is not yet supported.
AI Gateway is now generally available
June 11, 2025
Mosaic AI Gateway is now generally available. AI Gateway is a centralized service that streamlines the usage and management of generative AI models within an organization.
AI Gateway brings governance, monitoring, and production readiness to model serving endpoints using the following features:
- Permission and rate limiting to control who has access and how much access.
- Payload logging to monitor and audit data being sent to model APIs using inference tables.
- Usage tracking to monitor operational usage on endpoints and associated costs using system tables.
- Traffic routing to load balance traffic across multiple models.
- Fallbacks for external models to minimize production outages during and after deployment.
AI Guardrails remains in Public Preview.
AUTO CDC APIs replace APPLY CHANGES
June 11, 2025
The new AUTO CDC APIs create flows that support change data feeds (CDF) in Lakeflow Declarative Pipelines. Databricks recommends replacing usage of the APPLY CHANGES APIs with AUTO CDC.
For more information, see the documentation for the SQL AUTO CDC API and the Python create_auto_cdc_flow APIs.
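As a rough illustration, an existing APPLY CHANGES INTO statement maps to the new syntax along these lines (the table names, keys, and options are hypothetical; verify the exact clauses in the AUTO CDC reference for your release):

```sql
-- Previous approach (APPLY CHANGES):
-- APPLY CHANGES INTO target
-- FROM stream(cdc_source)
-- KEYS (customer_id)
-- SEQUENCE BY event_ts
-- STORED AS SCD TYPE 1;

-- New approach (AUTO CDC), assuming the same clause structure
CREATE OR REFRESH STREAMING TABLE target;

CREATE FLOW customers_cdc AS AUTO CDC INTO target
FROM stream(cdc_source)
KEYS (customer_id)
SEQUENCE BY event_ts
STORED AS SCD TYPE 1;
```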
Databricks Jobs is now Lakeflow Jobs
June 11, 2025
The product known as Databricks Jobs is now Lakeflow Jobs. No migration is required to use Lakeflow Jobs. See Lakeflow Jobs.
DLT is now Lakeflow Declarative Pipelines
June 11, 2025
The product known as DLT is now Lakeflow Declarative Pipelines. No migration is required to use Lakeflow Declarative Pipelines. See Lakeflow Declarative Pipelines.
Managed Apache Iceberg tables are in Public Preview
June 11, 2025
Managed Apache Iceberg tables are now in Public Preview. You can read from and write to these tables from Databricks or from external Iceberg engines using the Iceberg REST Catalog API. These tables are integrated with predictive optimization, which automatically applies advanced optimizations, including liquid clustering. See What is Apache Iceberg in Databricks? and Unity Catalog managed tables in Databricks for Delta Lake and Apache Iceberg.
Foreign Apache Iceberg tables are in Public Preview
June 11, 2025
Foreign Apache Iceberg tables are now in Public Preview. You can read Iceberg tables managed by foreign catalogs, such as HMS, Glue, and Snowflake Horizon, using Lakehouse Federation. These tables support Unity Catalog advanced features such as fine-grained access controls, lineage, and auditing. See What is Apache Iceberg in Databricks? and Work with foreign tables.
Convert to Unity Catalog managed table from external table is in Public Preview
June 11, 2025
ALTER TABLE ... SET MANAGED is now available in Public Preview for participating customers. This command enables seamless conversion of Unity Catalog external tables to managed tables. It allows you to take full advantage of Unity Catalog managed table features, such as enhanced governance, reliability, and performance. See Convert to Unity Catalog managed table from external table.
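For example, converting an external table in place could look like the following (the three-level table name is illustrative):

```sql
-- Convert a Unity Catalog external table to a managed table in place
ALTER TABLE main.finance.transactions_ext SET MANAGED;
```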
MLflow 3.0 is generally available
June 10, 2025
MLflow 3.0 is now generally available.
MLflow 3.0 on Databricks delivers state-of-the-art experiment tracking, observability, and performance evaluation for machine learning models, generative AI applications, and agents on the Databricks Lakehouse. See Get started with MLflow 3.
Deployment jobs (Public Preview)
June 10, 2025
Deployment jobs are now available in Public Preview.
Deployment jobs allow you to automate tasks like evaluation, approval, and deployment whenever a new model version is created, integrating seamlessly with Unity Catalog models and Lakeflow Jobs. See MLflow 3 deployment jobs.
Serverless performance targets is now GA
June 10, 2025
Selecting the serverless performance setting for jobs and pipelines is now generally available.
When the Performance optimized setting is enabled, your workload is optimized for faster startup and execution time. When disabled, the serverless workload runs on standard performance mode, which is optimized for cost and has a slightly higher launch latency.
For more information, see Select a performance mode.
Model Context Protocol (MCP) for AI agents is in Beta
June 10, 2025
Databricks now supports MCP, an open standard that lets AI agents securely access tools, resources, prompts, and other contextual information using a consistent interface.
- Managed MCP servers: Use Databricks-hosted servers for easy, no-maintenance access to Unity Catalog data and tools.
- Custom MCP servers: Host your own MCP server or third-party server as a Databricks app.
See Model context protocol (MCP) on Databricks.
Cross-platform view sharing is now GA
June 9, 2025
Cross-platform view sharing via Delta Sharing is now generally available. The data access and billing method for sharing views has been updated. See How do I incur and check Delta Sharing costs?.
A new system table allows you to track the shared materialized data history. See Delta Sharing materialization history system table reference.
Account admins can now configure the time-to-live (TTL) of data materialization. See Configure TTL of data materialization.
Tag policies are in Beta
June 9, 2025
Tag policies enable admins to enforce consistent tagging across data assets such as catalogs, schemas, and tables. A tag policy defines allowed tag keys and values, and controls which users and groups can assign them. This enables standardized metadata management for data classification, cost tracking, access control, and automation use cases.
Tag policies are managed at the account level and apply across all workspaces. For more information, see Tag policies.
New consumer entitlement is generally available
June 5, 2025
Workspace admins can now grant consumer access as an entitlement to users, service principals, and groups. This allows for more fine-grained control over what users can do in a Databricks workspace. Key details:
- Consumer access enables limited workspace UI access, querying SQL warehouses using BI tools, and viewing dashboards with embedded or viewer credentials.
- Useful for business users who need access to shared content and dashboards but not to author or manage workspace objects.
- This entitlement is more restrictive than workspace access or Databricks SQL access. To assign it independently, remove broader entitlements from the users group and configure them per user or group.
See Manage entitlements.
Salesforce Data Cloud File Sharing connector (Public Preview)
June 4, 2025
The Salesforce Data Cloud File Sharing connector offers an alternative zero-copy solution for querying Salesforce Data Cloud. When you use file federation instead of query federation, Databricks calls Salesforce Data-as-a-Service (DaaS) APIs to read data in the underlying cloud object storage location directly. Queries are run on Databricks compute without using JDBC.
Compared to query federation, file federation is ideal for federating a large amount of data. It offers improved performance for reading files from multiple data sources and better pushdown capabilities.
For more information, see Lakehouse Federation for Salesforce Data Cloud File Sharing.
Corrected job_name values in system.billing.usage
June 3, 2025
The usage_metadata.job_name value in the system.billing.usage table now correctly contains job names. Previously, this value was populated with task keys instead of the user-provided job names. This change does not apply to one-time job runs, which continue to be logged with the task key.
See Billable usage system table reference.
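With this fix, usage can be aggregated by the user-provided job name. A sketch of such a query (the aggregation choices are illustrative):

```sql
-- Summarize billable usage per job, now keyed by the user-provided job name
SELECT
  usage_metadata.job_id,
  usage_metadata.job_name,
  SUM(usage_quantity) AS total_dbus
FROM system.billing.usage
WHERE usage_metadata.job_name IS NOT NULL
GROUP BY 1, 2
ORDER BY total_dbus DESC;
```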
Mosaic AI Vector Search storage-optimized endpoints are Public Preview
June 3, 2025
Mosaic AI Vector Search now offers the option of storage-optimized endpoints. Storage-optimized endpoints have a larger capacity (over one billion vectors at dimension 768) and provide 10-20x faster indexing. Pricing is optimized for the larger number of vectors. For details, see Vector Search endpoint options.
History sharing now enabled by default to improve table read performance for Databricks-to-Databricks Delta Sharing (GA)
June 3, 2025
History sharing is enabled by default (for Databricks Runtime 16.2 and above) to improve table read performance for Databricks-to-Databricks Delta Sharing. See Improve table read performance with history sharing.
Unity Catalog HTTP connections support OAuth User-to-Machine Per User credentials (Public Preview)
June 2, 2025
Individual users can now use their own OAuth credentials to sign into external services when using a Unity Catalog HTTP connection. Previously, the only OAuth option was to share a single OAuth credential for the connection regardless of the user.
See Connect to external HTTP services.
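Once a user has signed in with their own OAuth credential, requests through the connection run under that user's identity. A sketch using the http_request SQL function, assuming a connection named salesforce_conn (the connection name and path are hypothetical; verify the function signature in your release):

```sql
-- Call an external service through a Unity Catalog HTTP connection;
-- with per-user OAuth U2M, this runs under the calling user's credential
SELECT http_request(
  conn => 'salesforce_conn',
  method => 'GET',
  path => '/services/data/v59.0/sobjects'
);
```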
Enhanced Security and Compliance Public Preview
June 1, 2025
Enhanced Security and Compliance is a platform add-on that provides enhanced security and controls for your compliance needs. See the pricing page. The Enhanced Security and Compliance add-on includes:
- Compliance security profile: The compliance security profile provides additional security monitoring, a hardened host OS image, and enforces automatic cluster update to ensure that your clusters are running with the latest security updates. When you enable the compliance security profile, you can choose to enable HIPAA as a compliance standard. See Compliance security profile.
- Enhanced security monitoring: Enhanced security monitoring provides the workspace with an enhanced hardened host OS image and additional security monitoring agents to improve your threat detection capabilities. See Enhanced security monitoring.
- Automatic cluster update: Automatic cluster update ensures that all the clusters in a workspace are periodically updated to the latest host OS image and security updates. See Automatic cluster update.
See Compliance.
New Enterprise pricing tier
June 1, 2025
The new Enterprise tier is tailored for organizations with advanced security and compliance needs. As part of this change, existing features like Private Service Connect and customer-managed keys are now available only in the Enterprise tier. These updates support a consistent, enterprise-grade experience across all supported cloud platforms and ensure that enhanced security and compliance capabilities are aligned with the requirements of customers operating in highly regulated environments. See Platform Tiers and Add-Ons.