
June 2025

These features and Databricks platform improvements were released in June 2025.

Note

Releases are staged. Your Databricks account might not be updated until a week or more after the initial release date.

OIDC federation for Databricks-to-open Delta Sharing is generally available

June 24, 2025

OpenID Connect (OIDC) federation for Delta Sharing is now generally available. With OIDC federation, recipients use JSON Web Tokens (JWTs) issued by their own identity provider (IdP) as short-lived OAuth tokens for secure, federated authentication.

See Use Open ID Connect (OIDC) federation to enable authentication to Delta Sharing shares (open sharing).

Migrate Lakeflow Declarative Pipelines from the legacy publishing mode

June 23, 2025

Lakeflow Declarative Pipelines has a legacy publishing mode that allows publishing to only a single catalog and schema. The default publishing mode enables publishing to multiple catalogs and schemas. Migration from the legacy publishing mode to the default publishing mode is now available.

See Enable the default publishing mode in a pipeline.

Custom rate limits for model serving endpoints

June 23, 2025

You can now specify custom rate limits for your model serving endpoints using AI Gateway. Provide rate limits for any of the following:

  • Endpoint: Specify the overall rate limit for all traffic passing through the endpoint, regardless of individual user or group limits.
  • User: Specify the rate limit for all users of the endpoint.
  • Specific user: Specify the rate limit for a particular user of the endpoint.
  • User group: Specify the rate limit for a particular user group. This limit is shared across all members of the group.
  • Service principal: Specify the rate limit for the number of requests per minute from a particular service principal.
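
As a rough sketch, the scopes above might be expressed as a rate-limit payload for the serving-endpoints API. The field names (`rate_limits`, `key`, `principal`, `calls`, `renewal_period`) are assumptions modeled on the shape of that API, not the exact schema; check the AI Gateway documentation before use.

```python
# Hypothetical AI Gateway rate-limit payload. Field names are
# assumptions modeled on the serving-endpoints API shape; consult
# the AI Gateway docs for the exact schema.

def build_rate_limits():
    """Return one rate-limit rule per supported scope."""
    return [
        # Overall cap on all traffic through the endpoint.
        {"key": "endpoint", "calls": 1000, "renewal_period": "minute"},
        # Cap applied to every user individually.
        {"key": "user", "calls": 100, "renewal_period": "minute"},
        # Cap scoped to one specific user.
        {"key": "user", "principal": "alice@example.com",
         "calls": 20, "renewal_period": "minute"},
        # Cap shared across all members of a group.
        {"key": "user_group", "principal": "data-analysts",
         "calls": 200, "renewal_period": "minute"},
        # Cap for requests from one service principal.
        {"key": "service_principal", "principal": "sp-etl-app",
         "calls": 500, "renewal_period": "minute"},
    ]

payload = {"rate_limits": build_rate_limits()}
```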

See Configure AI Gateway using the UI.

Sharing managed Iceberg tables in Delta Sharing is in Public Preview

June 23, 2025

You can now use Delta Sharing to share managed Iceberg tables in Databricks-to-Databricks sharing and open sharing. See Add managed Iceberg tables to a share and Read shared managed Iceberg tables.

Expanded feature support for FIPS-compliant environments

June 20, 2025

Model Serving CPU and GPU workloads, Foundation Model API provisioned throughput endpoints (excluding LLaMA 4), Mosaic AI Gateway, and external models are now available in FIPS-compliant environments. This includes support for the following compliance standards offered through the compliance security profile:

  • FedRAMP Moderate
  • IRAP
  • Canada Protected B

See Compliance security profile standards: CPU and GPU workloads for region availability of these standards.

AI documentation for Unity Catalog tables and columns now uses the same model as Databricks Assistant

June 18, 2025

AI documentation for Unity Catalog tables and table columns now uses a Databricks-hosted model:

  • If you already use a Databricks-hosted model for Assistant, there is no change.
  • If you turn off Partner-powered AI assistive features, AI documentation now stays available and uses a Databricks-hosted model.


Jobs & Pipelines in the left navigation menu

June 18, 2025

The Jobs & Pipelines item in the left navigation is the entry point to Lakeflow, the unified data engineering offering from Databricks. The Pipelines and Workflows items in the left navigation have been removed, and their functionality is now available from Jobs & Pipelines.

Moving streaming tables and materialized views between pipelines is in Public Preview

June 17, 2025

Tables created by Lakeflow Declarative Pipelines in Unity Catalog ETL pipelines can be moved from one pipeline to another. See Move tables between Lakeflow Declarative Pipelines.

Attribute-based access control (ABAC) in Unity Catalog is in Beta

June 12, 2025

Databricks now supports attribute-based access control (ABAC) in Unity Catalog, enabling dynamic, tag-driven access policies across catalogs, schemas, and tables. ABAC uses tags and user-defined functions (UDFs) to enforce fine-grained access controls based on data attributes such as sensitivity, region, or business domain.

Using ABAC, you can define scalable policies once and apply them across large sets of data assets. Policies inherit across the object hierarchy and can include row-level filters or column masking logic. This simplifies governance, supports centralized policy management, and improves security posture. See Unity Catalog attribute-based access control (ABAC).

Automatic liquid clustering is now GA

June 12, 2025

Automatic liquid clustering is now generally available. You can enable automatic liquid clustering on Unity Catalog managed tables. Automatic liquid clustering intelligently selects clustering keys to optimize data layout for your queries. See Automatic liquid clustering.

Lakebase, a managed PostgreSQL OLTP database, is in Public Preview

June 11, 2025

Lakebase is an online transaction processing (OLTP) engine that is fully integrated with the Databricks Data Intelligence Platform. You can create a database instance, a new compute type that provides dedicated, PostgreSQL-compatible compute and storage, enabling you to run transactional workloads alongside your lakehouse data.

See What is Lakebase?.

Monitor and revoke personal access tokens in your account (GA)

June 11, 2025

The token report page enables account admins to monitor and revoke personal access tokens (PATs) in the account console. Databricks recommends you use OAuth access tokens instead of PATs for greater security and convenience. See Monitor and revoke personal access tokens in the account.

Microsoft SharePoint connector (Beta)

June 11, 2025

The fully managed Microsoft SharePoint connector in Lakeflow Connect allows you to ingest data from SharePoint into Databricks. See Configure OAuth U2M for Microsoft SharePoint ingestion.

The Beta release supports API-based pipeline creation. UI-based pipeline creation is not yet supported.

AI Gateway is now generally available

June 11, 2025

Mosaic AI Gateway is now generally available. AI Gateway is a centralized service that streamlines the usage and management of generative AI models within an organization.

AI Gateway brings governance, monitoring, and production readiness to model serving endpoints using the following features:

  • Permission and rate limiting to control who has access and how much access.
  • Payload logging to monitor and audit data being sent to model APIs using inference tables.
  • Usage tracking to monitor operational usage on endpoints and associated costs using system tables.
  • Traffic routing to load balance traffic across multiple models.
  • Fallbacks for external models to minimize production outages during and after deployment.
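
As an illustration of the traffic routing and fallback features above, an endpoint configuration might look like the following sketch. The field names mirror the general shape of the serving-endpoints API but are assumptions; the model names and providers are made up for illustration.

```python
# Illustrative serving-endpoint config combining traffic routing,
# fallbacks, usage tracking, and payload logging. Field names are
# assumptions modeled on the serving-endpoints API; check the AI
# Gateway docs for the exact schema.

config = {
    "served_entities": [
        {"name": "model-a", "external_model": {"provider": "openai"}},
        {"name": "model-b", "external_model": {"provider": "anthropic"}},
    ],
    "traffic_config": {
        "routes": [
            # Load-balance traffic across the two external models.
            {"served_model_name": "model-a", "traffic_percentage": 80},
            {"served_model_name": "model-b", "traffic_percentage": 20},
        ]
    },
    "ai_gateway": {
        # Assumed flag: retry a failed external-model call on another route.
        "fallback_config": {"enabled": True},
        # Usage tracking (system tables) and payload logging (inference tables).
        "usage_tracking_config": {"enabled": True},
        "inference_table_config": {"enabled": True},
    },
}

# Traffic percentages across routes should cover all traffic.
total = sum(r["traffic_percentage"] for r in config["traffic_config"]["routes"])
```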
Note

AI Guardrails remains in Public Preview.

AUTO CDC APIs replace APPLY CHANGES

June 11, 2025

The new AUTO CDC APIs create flows that support change data feeds (CDF) in Lakeflow Declarative Pipelines. Databricks recommends replacing usage of APPLY CHANGES APIs with AUTO CDC.

For more information, see the reference documentation for the SQL AUTO CDC API and the Python create_auto_cdc_flow API.

Databricks Jobs is now Lakeflow Jobs

June 11, 2025

The product known as Databricks Jobs is now Lakeflow Jobs. No migration is required to use Lakeflow Jobs. See Lakeflow Jobs.

DLT is now Lakeflow Declarative Pipelines

June 11, 2025

The product known as DLT is now Lakeflow Declarative Pipelines. No migration is required to use Lakeflow Declarative Pipelines. See Lakeflow Declarative Pipelines.

Managed Apache Iceberg tables are in Public Preview

June 11, 2025

Managed Apache Iceberg tables are now in Public Preview. You can read from and write to these tables from Databricks or from external Iceberg engines using the Iceberg REST Catalog API. These tables are integrated with predictive optimization to automatically apply advanced optimizations, including liquid clustering. See What is Apache Iceberg in Databricks? and Unity Catalog managed tables in Databricks for Delta Lake and Apache Iceberg.

Foreign Apache Iceberg tables are in Public Preview

June 11, 2025

Foreign Apache Iceberg tables are now in Public Preview. You can read Iceberg tables managed by foreign catalogs, such as Hive metastore (HMS), AWS Glue, and Snowflake Horizon, using Lakehouse Federation. These tables support Unity Catalog advanced features such as fine-grained access controls, lineage, and auditing. See What is Apache Iceberg in Databricks? and Work with foreign tables.

Convert to Unity Catalog managed table from external table is in Public Preview

June 11, 2025

ALTER TABLE ... SET MANAGED is now available in Public Preview for participating customers. This command enables seamless conversion of Unity Catalog external tables to managed tables. It allows you to take full advantage of Unity Catalog managed table features, such as enhanced governance, reliability, and performance. See Convert to Unity Catalog managed table from external table.

MLflow 3.0 is generally available

June 10, 2025

MLflow 3.0 is now generally available.

MLflow 3.0 on Databricks delivers state-of-the-art experiment tracking, observability, and performance evaluation for machine learning models, generative AI applications, and agents on the Databricks Lakehouse. See Get started with MLflow 3.

Deployment jobs (Public Preview)

June 10, 2025

Deployment jobs are now available in Public Preview.

Deployment jobs allow you to automate tasks like evaluation, approval, and deployment whenever a new model version is created, integrating seamlessly with Unity Catalog models and Lakeflow Jobs. See MLflow 3 deployment jobs.

Serverless performance targets is now GA

June 10, 2025

Selecting the serverless performance setting for jobs and pipelines is now generally available.

When the Performance optimized setting is enabled, your workload is optimized for faster startup and execution time. When disabled, the serverless workload runs on standard performance mode, which is optimized for cost and has a slightly higher launch latency.

For more information, see Select a performance mode.

Model Context Protocol (MCP) for AI agents is in Beta

June 10, 2025

Databricks now supports MCP, an open standard that lets AI agents securely access tools, resources, prompts, and other contextual information using a consistent interface.

  • Managed MCP servers: Use Databricks-hosted servers for easy, no-maintenance access to Unity Catalog data and tools.
  • Custom MCP servers: Host your own MCP server or third-party server as a Databricks app.

See Model context protocol (MCP) on Databricks.
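
For context, MCP clients and servers exchange JSON-RPC 2.0 messages; invoking a tool uses the spec's `tools/call` method. The sketch below shows the envelope of such a request; the tool name and arguments are made up for illustration and do not refer to any actual Databricks-managed server.

```python
# Minimal JSON-RPC 2.0 "tools/call" request, the message an MCP client
# sends to invoke a tool on an MCP server. The envelope follows the MCP
# specification; the tool name and arguments are hypothetical.
import json

request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "query_table",  # hypothetical tool exposed by a server
        "arguments": {"table": "main.sales.orders", "limit": 10},
    },
}

# Serialized form as it would travel over the transport.
wire = json.dumps(request)
```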

Cross-platform view sharing is now GA

June 9, 2025

Cross-platform view sharing via Delta Sharing is now generally available. The data access and billing methods for sharing views have been updated. See How do I incur and check Delta Sharing costs?.

A new system table allows you to track the shared materialized data history. See Delta Sharing materialization history system table reference.

Account admins can now configure the time-to-live (TTL) of data materialization. See Configure TTL of data materialization.

Manage network policies for serverless egress control (Generally Available)

June 9, 2025

You can configure and enforce outbound network policies for serverless compute resources, including SQL warehouses and model serving endpoints.

With network policies you can:

  • Configure outbound access for serverless workloads.
  • Allowlist specific domains and storage accounts for restricted mode.
  • Enable dry-run mode to monitor policy impact before enforcement.
  • View and analyze denial logs in Unity Catalog for auditing and troubleshooting.

See Manage network policies for serverless egress control.
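
The restricted mode, allowlists, and dry-run behavior above might be captured in a policy payload along these lines. The key names (`restriction_mode`, `allowed_internet_destinations`, and so on) are assumptions modeled on the account network-policies API, not the exact schema.

```python
# Hypothetical serverless network-policy payload. Key names are
# assumptions modeled on the account network-policies API; check the
# egress-control docs for the exact schema.

policy = {
    "network_policy_id": "strict-egress",
    "egress": {
        "network_access": {
            # Restricted mode blocks all outbound traffic except the
            # allowlisted destinations below.
            "restriction_mode": "RESTRICTED_ACCESS",
            "allowed_internet_destinations": [
                {"destination": "api.example.com",
                 "internet_destination_type": "DNS_NAME"},
            ],
            "allowed_storage_destinations": [
                {"bucket_name": "my-data-bucket",
                 "storage_destination_type": "AWS_S3"},
            ],
            # Dry run: log would-be denials without enforcing the policy.
            "policy_enforcement": {"enforcement_mode": "DRY_RUN"},
        }
    },
}
```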

Tag policies is in Beta

June 9, 2025

Tag policies enable admins to enforce consistent tagging across data assets such as catalogs, schemas, and tables. Tag policies define allowed tag keys and values, and control which users and groups can assign them. This enables standardized metadata management for data classification, cost tracking, access control, and automation use cases.

Tag policies are managed at the account level and apply across all workspaces. For more information, see Tag policies.

Private connectivity from serverless compute to AWS S3 storage buckets

June 9, 2025

You can now enable AWS PrivateLink to your S3 storage buckets from serverless compute. See Configure private connectivity to AWS S3 storage buckets.

Private connectivity from serverless compute to resources in your VPC

June 6, 2025

You can now enable AWS PrivateLink to resources in your virtual private cloud (VPC) via a network load balancer (NLB) from serverless compute. See Configure private connectivity to resources in your VPC.

Serverless GPU compute is in Beta

June 6, 2025

Serverless GPU compute is now part of the Serverless compute offering. Serverless GPU compute is specialized for custom single and multi-node deep learning workloads. You can use serverless GPU compute to train and fine-tune custom models using your favorite frameworks and get state-of-the-art efficiency, performance, and quality.

See Serverless GPU compute.

New consumer entitlement is generally available

June 5, 2025

Workspace admins can now grant consumer access as an entitlement to users, service principals, and groups. This allows for more fine-grained control over what users can do in a Databricks workspace. Key details:

  • Consumer access enables limited workspace UI access, querying SQL warehouses using BI tools, and viewing dashboards with embedded or viewer credentials.

  • Useful for business users who need access to shared content and dashboards but not to author or manage workspace objects.

  • This entitlement is more restrictive than workspace access or Databricks SQL access. To assign it independently, remove broader entitlements from the users group and configure them per user or group.

See Manage entitlements.
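
Entitlements can also be assigned programmatically through the SCIM API using a standard RFC 7644 PatchOp body, as in the sketch below. The entitlement value `"consumer-access"` is an assumption for illustration; confirm the exact value in the entitlements documentation.

```python
# Sketch of a SCIM PATCH body granting an entitlement to a user or
# group. The PatchOp envelope follows RFC 7644; the entitlement value
# "consumer-access" is an assumption, not a confirmed identifier.

def grant_entitlement_patch(entitlement: str) -> dict:
    """Build a SCIM PatchOp body that adds one entitlement."""
    return {
        "schemas": ["urn:ietf:params:scim:api:messages:2.0:PatchOp"],
        "Operations": [
            {
                "op": "add",
                "path": "entitlements",
                "value": [{"value": entitlement}],
            }
        ],
    }

body = grant_entitlement_patch("consumer-access")
```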

New region: AWS Asia Pacific (Jakarta) (ap-southeast-3)

June 5, 2025

Databricks is now available in the AWS Asia Pacific (Jakarta) region (ap-southeast-3). Customers can now create new workspaces and use Databricks products in this region. This expansion allows organizations operating in or near Indonesia to deploy Databricks closer to their data and users, improving performance and meeting regional compliance requirements.

For the full list of supported regions, see Databricks clouds and regions.

Salesforce Data Cloud File Sharing connector (Public Preview)

June 4, 2025

The Salesforce Data Cloud File Sharing connector offers an alternative zero-copy solution for querying Salesforce Data Cloud. When you use file federation instead of query federation, Databricks calls Salesforce Data-as-a-Service (DaaS) APIs to read data in the underlying cloud object storage location directly. Queries are run on Databricks compute without using JDBC.

Compared to query federation, file federation is ideal for federating a large amount of data. It offers improved performance for reading files from multiple data sources and better pushdown capabilities.

For more information, see Lakehouse Federation for Salesforce Data Cloud File Sharing.

Corrected job_name values in system.billing.usage

June 3, 2025

The usage_metadata.job_name value in the system.billing.usage table now correctly contains job names. Previously, this value was populated with task keys instead of the user-provided job names. This change does not apply to one-time job runs, which continue to be logged with the task key.

See Billable usage system table reference.
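
As a toy illustration of what the corrected field enables, the snippet below aggregates usage by `usage_metadata.job_name` over fabricated sample rows. In practice you would query `system.billing.usage` with SQL; the rows here are invented, not real system-table output.

```python
# Toy aggregation of billable usage by job name, now that
# usage_metadata.job_name carries the user-provided name.
# The rows below are fabricated sample data for illustration only.

sample_rows = [
    {"usage_metadata": {"job_name": "nightly_etl"}, "usage_quantity": 4.0},
    {"usage_metadata": {"job_name": "nightly_etl"}, "usage_quantity": 2.5},
    # One-time job runs are still logged with the task key instead.
    {"usage_metadata": {"job_name": "adhoc_task_key"}, "usage_quantity": 1.0},
]

def usage_by_job(rows):
    """Sum usage_quantity per job name."""
    totals = {}
    for row in rows:
        name = row["usage_metadata"]["job_name"]
        totals[name] = totals.get(name, 0.0) + row["usage_quantity"]
    return totals

totals = usage_by_job(sample_rows)
```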

Mosaic AI Vector Search storage-optimized endpoints are Public Preview

June 3, 2025

Mosaic AI Vector Search now offers the option of storage-optimized endpoints. Storage-optimized endpoints have a larger capacity (over one billion vectors at dimension 768) and provide 10-20x faster indexing. Pricing is optimized for the larger number of vectors. For details, see Vector Search endpoint options.

History sharing now enabled by default to improve table read performance for Databricks-to-Databricks Delta Sharing (GA)

June 3, 2025

History sharing is enabled by default (for Databricks Runtime 16.2 and above) to improve table read performance for Databricks-to-Databricks Delta Sharing. See Improve table read performance with history sharing.

Unity Catalog HTTP connections support OAuth User-to-Machine Per User credentials (Public Preview)

June 2, 2025

Individual users can now use their own OAuth credentials to sign into external services when using a Unity Catalog HTTP connection. Previously, the only OAuth option was to share a single OAuth credential for the connection regardless of the user.

See Connect to external HTTP services.