October 2025 platform release notes
These features and SAP Databricks platform improvements were released in October 2025. The cloud the release applies to is indicated for each release note.
Releases are staged. Your SAP Databricks account might not be updated until a week or more after the initial release date.
Skip cells when running notebooks
October 31, 2025 | Applies to: AWS, GCP, Azure
You can now skip individual cells when running multiple cells in a notebook using the %skip magic command. Add %skip at the beginning of any cell you want to skip.
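For example, a cell that starts with the magic command is bypassed during a multi-cell run (the function below is a hypothetical stand-in for any work you want to skip):

```python
%skip
# This cell is skipped when running multiple cells or using "Run all".
results = rebuild_feature_table()  # hypothetical long-running step
```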
Improved notebook debugger experience
October 31, 2025 | Applies to: AWS, GCP, Azure
The Python notebook interactive debugger now supports multi-file debugging. You can set breakpoints and step into functions across multiple workspace files. The debugger automatically opens the file in a new tab when you step into it. This improvement makes it easier to debug code that spans multiple files in your workspace.
Feedback model deprecated for AI agents
October 29, 2025 | Applies to: AWS, GCP, Azure
The experimental feedback model for AI agents has been deprecated. Starting November 1, 2025, newly deployed agents won't include a feedback model. Upgrade to MLflow 3 and use the log_feedback API to collect assessments on agent traces.
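As a minimal sketch of the replacement path, assuming an MLflow 3 environment and a trace ID captured from your deployed agent (the trace ID, assessment name, and reviewer below are placeholders):

```python
import mlflow
from mlflow.entities import AssessmentSource, AssessmentSourceType

# Log a human assessment against an existing agent trace (MLflow 3).
mlflow.log_feedback(
    trace_id="tr-0123456789abcdef",  # placeholder trace ID
    name="correctness",
    value=True,
    source=AssessmentSource(
        source_type=AssessmentSourceType.HUMAN,
        source_id="reviewer@example.com",
    ),
    rationale="Answer matched the reference documentation.",
)
```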
Request logs and assessment logs tables deprecated
October 29, 2025 | Applies to: AWS, GCP, Azure
The payload_request_logs and payload_assessment_logs tables are deprecated. Starting November 1, 2025, newly deployed agents won't have these tables. Starting November 15, 2025, existing tables won't be populated with new data. Upgrade to MLflow 3 for real-time tracing or use the provided views.
Databricks JDBC Driver 2.7.5
October 23, 2025 | Applies to: AWS, GCP, Azure
Databricks JDBC Driver (Simba) version 2.7.5 is now available with the following improvements:
New features
The connector now supports Kerberos with proxy connections. To enable Kerberos proxy, set UseProxy=1 and ProxyAuth=2. To set proxy details, use ProxyHost, ProxyPort, ProxyKrbRealm, ProxyKrbFQDN, and ProxyKrbService.
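Assembled into a connection URL, the proxy settings might look like the following sketch (hostname, HTTP path, and all Kerberos values are placeholders; authentication properties for the workspace itself are omitted):

```python
# Hedged sketch of a Kerberos-over-proxy JDBC URL with placeholder values.
jdbc_url = (
    "jdbc:databricks://<server-hostname>:443;httpPath=<http-path>;"
    "UseProxy=1;ProxyAuth=2;ProxyHost=<proxy-host>;ProxyPort=<proxy-port>;"
    "ProxyKrbRealm=<realm>;ProxyKrbFQDN=<proxy-fqdn>;ProxyKrbService=<service>"
)
```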
Resolved issues
- Fixed an issue where the connector failed to run complex queries that contained ? characters in native mode.
- Fixed intermittent failures in Unity Catalog volume ingestion caused by unexpected connector behavior.
- Fixed an assertion error in getColumns when a table included a column of type Void or Variant and the java -ea flag was enabled.
ai_parse_document (Preview)
October 23, 2025 | Applies to: AWS, GCP, Azure
ai_parse_document is now available in Preview.
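As a minimal sketch, assuming documents staged in a Unity Catalog volume (the volume path is a placeholder) and a Databricks notebook session where spark is available:

```python
# Parse binary documents from a volume and return structured output.
spark.sql("""
  SELECT
    path,
    ai_parse_document(content) AS parsed
  FROM READ_FILES('/Volumes/main/default/docs/', format => 'binaryFile')
""").show(truncate=False)
```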
Column drop behavior updated
October 22, 2025 | Applies to: AWS, GCP, Azure
When you attempt to drop a column that has one or more governed tags assigned, the operation now fails. To drop a tagged column, you must first remove all governed tags from it.
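A minimal sketch of the resulting workflow, with a hypothetical table, column, and governed tag:

```python
# Remove the governed tag from the column first, then drop the column.
spark.sql("""
  ALTER TABLE main.sales.orders
  ALTER COLUMN customer_ssn
  UNSET TAGS ('system.certification_status')
""")
spark.sql("ALTER TABLE main.sales.orders DROP COLUMN customer_ssn")
```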
Zstd is now the default compression for new Delta tables
October 21, 2025 | Applies to: AWS, GCP, Azure
All newly created Delta tables in Databricks Runtime 16.0 and above now use Zstandard (Zstd) compression by default instead of Snappy.
Existing tables continue to use their current compression codec. To change the compression codec for an existing table, set the delta.parquet.compression.codec table property.
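For example, to pin an existing table back to Snappy (the table name is a placeholder):

```python
# Override the compression codec for an existing Delta table.
spark.sql("""
  ALTER TABLE main.default.events
  SET TBLPROPERTIES ('delta.parquet.compression.codec' = 'snappy')
""")
```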
Databricks Asset Bundles in the workspace is GA
October 16, 2025 | Applies to: AWS, GCP, Azure
Databricks Asset Bundles in the workspace is now generally available (GA). This feature allows you to collaborate with other users in your organization to edit, commit, test, and deploy bundle updates through the UI.
Improved autoscaling behavior for Mosaic AI Model Serving
October 13, 2025 | Applies to: AWS, GCP, Azure
Autoscaling in Mosaic AI Model Serving has been tuned to ignore extremely brief traffic surges and instead respond only to sustained increases in load. This change prevents unnecessary provisioned concurrency scaling during momentary bursts and reduces serving costs without impacting performance or reliability.
Data Classification (Preview)
October 13, 2025 | Applies to: AWS, GCP, Azure
Databricks Data Classification is now in Preview. It supports all catalog types, consolidates all classification results into a single system table, and adds a new UI for reviewing and auto-tagging classifications.
Multimodal support is now available
October 13, 2025 | Applies to: AWS, GCP, Azure
Mosaic AI Model Serving now supports multimodal inputs for Databricks hosted foundation models.
Multimodal support is available through the following:
- Foundation Model APIs pay-per-token.
- Foundation Model APIs provisioned throughput.
- AI Functions, for both real-time inference and batch inference workloads.
Serverless private git (Preview)
October 6, 2025 | Applies to: Azure
You can now connect a Databricks workspace to a private Git server using serverless compute and Azure Private Link.
Partition metadata is generally available
October 6, 2025 | Applies to: AWS, GCP, Azure
You can now enable partition metadata logging, a partition discovery strategy for external tables registered to Unity Catalog.
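As a sketch of how to opt in, assuming the spark.databricks.nonDelta.partitionLog.enabled conf (the conf name is an assumption here) and a placeholder external table; the setting applies at table creation time:

```python
# Enable partition metadata logging for external tables created in this
# session; the conf name is an assumption, and the path is a placeholder.
spark.conf.set("spark.databricks.nonDelta.partitionLog.enabled", "true")
spark.sql("""
  CREATE TABLE main.default.raw_events (id BIGINT, dt STRING)
  USING parquet
  PARTITIONED BY (dt)
  LOCATION 's3://<bucket>/raw_events'
""")
```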
Delta Sharing recipients can apply row filters and column masks (GA)
October 6, 2025 | Applies to: AWS, GCP, Azure
Delta Sharing recipients can now apply their own row filters and column masks on shared tables and shared foreign tables. However, Delta Sharing providers still cannot share data assets that have row-level security or column masks.
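A minimal recipient-side sketch, assuming a shared table mounted under a recipient catalog (all names and the filter function are hypothetical):

```python
# Define a row-filter function, then attach it to the shared table.
spark.sql("""
  CREATE OR REPLACE FUNCTION main.default.us_only(region STRING)
  RETURN region = 'US'
""")
spark.sql("""
  ALTER TABLE shared_catalog.sales.orders
  SET ROW FILTER main.default.us_only ON (region)
""")
```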
Certification status system tag is in Preview
October 6, 2025 | Applies to: AWS, GCP, Azure
You can now apply the system.certification_status governed tag to catalogs, schemas, tables, views, volumes, dashboards, registered models, and Genie Spaces to indicate whether a data asset is certified or deprecated. This improves governance, discoverability, and trust in analytics and AI workloads.
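For example, certifying a table might look like the following sketch (the table name is a placeholder; the tag value follows the certified/deprecated states described above):

```python
# Apply the governed certification tag to a Unity Catalog table.
spark.sql("""
  ALTER TABLE main.analytics.daily_revenue
  SET TAGS ('system.certification_status' = 'certified')
""")
```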
Prompt caching is now supported for Claude models
October 3, 2025 | Applies to: AWS, GCP, Azure
Prompt caching is now supported for Databricks-hosted Claude models. You can specify the cache_control parameter in your query requests to cache the following (a request sketch follows the list):
- Thinking message content in the messages.content array.
- Image content blocks in the messages.content array.
- Tool use, results, and definitions in the tools array.
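A sketch of such a request, caching a long system prompt block (workspace URL, token, and endpoint name are placeholders; the cache_control block follows the Anthropic-style convention):

```python
import requests

# Mark a long system prompt block as cacheable so repeated queries reuse it.
response = requests.post(
    "https://<workspace-url>/serving-endpoints/databricks-claude-sonnet-4-5/invocations",
    headers={"Authorization": "Bearer <token>"},
    json={
        "messages": [
            {
                "role": "system",
                "content": [
                    {
                        "type": "text",
                        "text": "<long reference document>",
                        "cache_control": {"type": "ephemeral"},
                    }
                ],
            },
            {"role": "user", "content": "Summarize the key findings."},
        ]
    },
)
print(response.json())
```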
Anthropic Claude Sonnet 4.5 now available as a Databricks-hosted model
October 3, 2025 | Applies to: AWS, GCP, Azure
Mosaic AI Model Serving now supports Anthropic Claude Sonnet 4.5 as a Databricks-hosted model. You can access this model using Foundation Model APIs pay-per-token.
Notebook improvements
October 3, 2025 | Applies to: AWS, GCP, Azure
The following notebook improvements are now available:
- The cell execution minimap now appears in the right margin of notebooks. Use the minimap to get a visual overview of your notebook's run status and quickly navigate between cells.
- Use Databricks Assistant to help diagnose and fix environment errors, including library installation errors.
- When reconnecting to serverless notebooks, sessions are automatically restored with the notebook's Python variables and Spark state.
- PySpark authoring completion now supports agg, withColumns, withColumnsRenamed, and filter/where clauses.
- Databricks now supports importing and exporting IPYNB notebooks up to 100 MB. Revision snapshot autosaving, manual saving, and cloning are supported for all notebooks up to 100 MB.
- When cloning or exporting notebooks, you can now choose whether to include cell outputs.
Anthropic Claude Sonnet 4 is available for batch inference in US regions
October 3, 2025 | Applies to: AWS, GCP, Azure
Mosaic AI Model Serving now supports Anthropic Claude Sonnet 4 for batch inference workflows. You can now use databricks-claude-sonnet-4 in your ai_query requests to perform batch inference.
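For example, a batch inference query over a source table might look like this (the table and column names are placeholders):

```python
# Batch inference with ai_query over a table of support tickets.
spark.sql("""
  SELECT
    ticket_id,
    ai_query(
      'databricks-claude-sonnet-4',
      CONCAT('Classify the sentiment of this ticket: ', ticket_text)
    ) AS sentiment
  FROM main.support.tickets
""").show()
```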
Convert to Unity Catalog managed table from external table
October 2, 2025 | Applies to: AWS, GCP, Azure
The ALTER TABLE ... SET MANAGED command is now generally available. This command seamlessly converts Unity Catalog external tables to managed tables. It allows you to take full advantage of Unity Catalog managed table features, such as enhanced governance, reliability, and performance.
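For example (the table name is a placeholder for an existing Unity Catalog external table):

```python
# Convert an external table to a Unity Catalog managed table in place.
spark.sql("ALTER TABLE main.default.legacy_events SET MANAGED")
```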
Git email identity configuration for Git folders
October 1, 2025 | Applies to: AWS, GCP, Azure
You can now specify a Git provider email address, separate from your username, when creating Git credentials for Databricks Git folders. This email is used as the Git author and committer identity for all commits made through Git folders, ensuring proper attribution in your Git provider and better integration with your Git account.
The email you provide becomes the GIT_AUTHOR_EMAIL and GIT_COMMITTER_EMAIL for commits, allowing Git providers to properly associate commits with your user account and display your profile information. If no email is specified, Databricks uses your Git username as the email address (legacy behavior).
New permissions for the Databricks GitHub App
October 1, 2025 | Applies to: AWS, GCP, Azure
If you own a SAP Databricks account with the SAP Databricks GitHub app installed, you may receive an email titled "Databricks is requesting updated permissions" from GitHub.
This is a legitimate request from Databricks. It asks you to approve a new permission that allows SAP Databricks to read your GitHub account email(s). Granting this permission will let SAP Databricks retrieve and save your primary GitHub account email to your Linked Git credential in SAP Databricks. In an upcoming feature, this will ensure that commits made from SAP Databricks are properly linked to your GitHub identity.
If you don't accept the new permission, your Linked Git credential will still authenticate with GitHub. However, future commits from this credential will not be associated with your GitHub account identity.