September 2025 platform release notes
These features and SAP Databricks platform improvements were released in September 2025. The clouds each release applies to are indicated in each release note.
Releases are staged. Your SAP Databricks account might not be updated until a week or more after the initial release date.
Anthropic Claude Opus 4.1 now available as a Databricks-hosted foundation model
September 29, 2025 | Applies to: AWS, GCP, Azure
Mosaic AI Model Serving now supports Anthropic's Claude Opus 4.1 as a Databricks-hosted foundation model. You can access this model using Foundation Model APIs pay-per-token.
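The following is a minimal sketch of querying a pay-per-token Foundation Model APIs endpoint through the OpenAI-compatible client. The endpoint name, workspace hostname, and token are placeholders; confirm the exact endpoint name for Claude Opus 4.1 on your workspace's Serving page.

```python
# Minimal sketch: query a pay-per-token Foundation Model APIs endpoint with the
# OpenAI-compatible client. The endpoint name below is an assumption; confirm it
# in your workspace's Serving page.
from openai import OpenAI

client = OpenAI(
    api_key="<your-databricks-token>",
    base_url="https://<workspace-hostname>/serving-endpoints",
)

response = client.chat.completions.create(
    model="databricks-claude-opus-4-1",  # assumed pay-per-token endpoint name
    messages=[{"role": "user", "content": "Summarize Delta Sharing in one sentence."}],
    max_tokens=256,
)
print(response.choices[0].message.content)
```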
Mosaic AI Agent Framework supports automatic authentication passthrough for Lakebase resources
September 23, 2025 | Applies to: AWS, Azure
Mosaic AI Agent Framework now supports automatic authentication passthrough for Lakebase resources. This feature requires MLflow 3.3.2 or above.
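As a rough sketch of how this fits the existing Agent Framework pattern, resource dependencies are declared when the agent is logged so that Model Serving can provision credentials automatically at serving time. The Lakebase resource class shown commented out is a hypothetical name for illustration only; check the MLflow 3.3.2+ API reference for the actual Lakebase resource class and its arguments.

```python
# Sketch, assuming the standard MLflow "resources" pattern for automatic auth
# passthrough. DatabricksLakebase is a hypothetical class name used for
# illustration only; look up the real Lakebase resource class in MLflow 3.3.2+.
import mlflow
from mlflow.models.resources import DatabricksServingEndpoint
# from mlflow.models.resources import DatabricksLakebase  # hypothetical

with mlflow.start_run():
    mlflow.pyfunc.log_model(
        name="lakebase_agent",
        python_model="agent.py",  # models-from-code agent definition
        resources=[
            DatabricksServingEndpoint(endpoint_name="databricks-claude-opus-4-1"),
            # DatabricksLakebase(database_instance_name="my-instance"),  # hypothetical
        ],
    )
```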
Route-optimized endpoints now require route-optimized URL path for querying
September 22, 2025 | Applies to: AWS, GCP, Azure
All newly created route-optimized endpoints must be queried using the route-optimized URL. Queries using the standard workspace URL path are not supported for route-optimized endpoints created after September 22, 2025.
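A small sketch of the difference follows, with placeholder hostnames and a generic payload; copy the actual route-optimized URL from the endpoint's Serving page.

```python
# Sketch: route-optimized endpoints created after September 22, 2025 must be
# queried through their dedicated route-optimized URL. Hostnames, token, and
# payload below are placeholders.
import requests

headers = {"Authorization": "Bearer <your-databricks-token>"}
payload = {"dataframe_records": [{"feature_a": 1.0, "feature_b": 2.0}]}

# No longer supported for these endpoints (standard workspace URL path):
#   https://<workspace-hostname>/serving-endpoints/<endpoint-name>/invocations

# Use the endpoint's route-optimized URL instead (copy it from the Serving page):
route_optimized_url = "https://<route-optimized-endpoint-url>/invocations"
response = requests.post(route_optimized_url, headers=headers, json=payload)
print(response.json())
```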
Explore table data using an LLM (Preview)
September 22, 2025 | Applies to: AWS, GCP, Azure
You can now ask natural language questions about sample data using Catalog Explorer. The Assistant uses metadata context and table usage patterns to generate a SQL query. You can then validate the query and run it against the underlying table.
Compliance support for Mosaic AI Vector Search standard endpoints
September 18, 2025 | Applies to: AWS
Mosaic AI Vector Search standard endpoints are now available in FedRAMP Moderate and IRAP workspaces.
Mount Delta shares to an existing shared catalog
September 12, 2025 | Applies to: AWS, GCP, Azure
Delta Sharing recipients can now mount shares received from their Delta Sharing provider to an existing shared catalog. Previously, recipients needed to create a new catalog for each new share.
Automatic identity management is generally available
September 10, 2025 | Applies to: Azure
Automatic identity management enables you to sync users, service principals, and groups from Microsoft Entra ID into SAP Databricks without configuring an application in Microsoft Entra ID. When enabled, you can directly search in identity federated workspaces for Microsoft Entra ID users, service principals, and groups and add them to your workspace. Databricks uses Microsoft Entra ID as the source of record, so any changes to group memberships are respected in SAP Databricks. Automatic identity management also supports nested groups.
SAP Databricks connector in Microsoft Power Platform is in Preview
September 5, 2025 | Applies to: AWS, GCP
Use SAP Databricks data to build canvas apps in Power Apps, flows in Power Automate, and agents in Copilot Studio by creating an SAP Databricks connection in Power Platform.
Databricks Online Feature Stores (Preview)
September 5, 2025 | Applies to: AWS, Azure
Databricks Online Feature Stores, powered by Lakebase, provide highly scalable, low-latency access to feature data while maintaining consistency with your offline feature tables. Native integrations with Unity Catalog, MLflow, and Mosaic AI Model Serving help you productionize your model endpoints, agents, and rule engines, so they can automatically and securely access features from Online Feature Stores while maintaining high performance.
MLflow metadata is now available in system tables (Preview)
September 5, 2025 | Applies to: AWS, GCP, Azure
MLflow metadata is now available in system tables. View metadata managed by the MLflow tracking service across the entire workspace in one central location, and take advantage of the lakehouse tooling Databricks offers, such as custom dashboards, SQL alerts, and large-scale analytic queries.
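For illustration, a short notebook-style query over the preview tables; the table name is an assumption, so check the preview documentation for the exact schema and table names exposed in your workspace.

```python
# Illustrative notebook snippet (assumes a Databricks notebook where `spark` and
# `display` are predefined). The table name system.mlflow.runs_latest is an
# assumed example; verify the actual table names in the preview docs.
recent_runs = spark.sql("SELECT * FROM system.mlflow.runs_latest LIMIT 100")
display(recent_runs)
```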
AI agents: Authorize on-behalf-of-user access (Preview)
September 2, 2025 | Applies to: AWS, GCP, Azure
AI agents deployed to Model Serving endpoints can use on-behalf-of-user authorization. This lets an agent act as the Databricks user who runs the query for added security and fine-grained access to sensitive data.
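A minimal sketch of the runtime side, assuming the ModelServingUserCredentials strategy from the Databricks SDK; verify the import path and the agent-logging requirements against the current documentation.

```python
# Sketch: inside the deployed agent's code, create a client that carries the
# identity of the user who invoked the endpoint, so Unity Catalog permissions
# are enforced per user. ModelServingUserCredentials and its import path are
# assumptions to verify against your databricks-sdk version.
from databricks.sdk import WorkspaceClient
from databricks.sdk.credentials_provider import ModelServingUserCredentials

user_client = WorkspaceClient(credentials_strategy=ModelServingUserCredentials())

# Example: list tables visible to the querying user, not to a service principal.
tables = user_client.tables.list(catalog_name="main", schema_name="default")
for t in tables:
    print(t.full_name)
```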