Databricks on AWS GovCloud release notes 2025

The following platform features, improvements, and fixes were released on Databricks on AWS GovCloud in 2025.

May 2025

The following features and updates were released on Databricks on AWS GovCloud in May 2025.

Workflow task repair now respects transitive dependencies

May 19, 2025

Previously, repaired tasks were unblocked as soon as their direct dependencies completed. Now, a repaired task waits for all of its transitive dependencies. For example, in the graph A → B → C, repairing A and C blocks C until A finishes, because C depends on A transitively through B.

You can now collaborate with multiple parties with Databricks Clean Rooms

May 14, 2025

Databricks Clean Rooms now supports:

  • Up to 10 collaborators for more complex, multi-party data projects.
  • A new notebook approval workflow that enhances security and compliance by allowing designated runners and requiring explicit approval before execution.
  • Auto-approval options for trusted partners.
  • Difference views for easy review and auditing.

These updates enable more secure, scalable, and auditable collaboration.

Databricks Runtime 16.4 LTS is GA

May 13, 2025

Databricks Runtime 16.4 LTS and Databricks Runtime 16.4 LTS ML are now generally available.

See Databricks Runtime 16.4 LTS and Databricks Runtime 16.4 LTS for Machine Learning.

Databricks JDBC driver 2.7.3

May 12, 2025

The Databricks JDBC Driver version 2.7.3 is now available for download from the JDBC driver download page.

This release includes the following enhancements and new features:

  • Added support for Azure Managed Identity OAuth 2.0 authentication. To enable this, set the Auth_Flow property to 3.
  • Added support for OAuth token exchange for identity providers (IdPs) other than the workspace host. OAuth access tokens (including bring-your-own-token, BYOT) are exchanged for a Databricks access token.
  • OAuth browser authentication (Auth_Flow=2) now supports token caching on Linux and macOS.
  • Added support for VOID, Variant, and TIMESTAMP_NTZ data types in getColumns() and getTypeInfo() APIs.
  • The driver now lists columns with unknown or unsupported types and maps them to SQL VARCHAR in the getColumns() metadata API.
  • Added support for cloud.databricks.us and cloud.databricks.mil domains when connecting to Databricks using OAuth (AuthMech=11).
  • Upgraded to netty-buffer 4.1.119 and netty-common 4.1.119 (previously 4.1.115).

This release resolves the following issues:

  • Compatibility issues when deserializing Apache Arrow data on JVM versions 11 and higher.
  • Issues with dates and timestamps earlier than the start of the Gregorian calendar when connecting to certain Spark versions with Arrow result set serialization.

For complete configuration information, see the Databricks JDBC Driver Guide installed with the driver download package.
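
For illustration, the following sketch shows how the options called out above fit into a JDBC connection URL, driven here from Python through the jaydebeapi bridge. The workspace host, HTTP path, and JAR location are hypothetical; any JDBC-capable client works the same way.

    # Hypothetical connection values -- replace with your own workspace host,
    # HTTP path, and driver JAR path.
    import jaydebeapi

    url = (
        "jdbc:databricks://myworkspace.cloud.databricks.us:443;"  # .us/.mil domains now work with OAuth
        "httpPath=/sql/1.0/warehouses/abc123def456;"
        "AuthMech=11;"  # OAuth authentication
        "Auth_Flow=2"   # browser-based OAuth; tokens are now cached on Linux and macOS
    )

    conn = jaydebeapi.connect(
        "com.databricks.client.jdbc.Driver", url, jars="DatabricksJDBC42.jar"
    )
    cursor = conn.cursor()
    cursor.execute("SELECT 1")
    print(cursor.fetchall())
    cursor.close()
    conn.close()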

Databricks GitHub App adds workflow scope to support authoring GitHub Actions

May 9, 2025

Databricks has made a change that may result in an email request for the Read and write access to Workflows scope for the Databricks GitHub app. This change makes the scope of the Databricks GitHub app consistent with the scope required by other supported authentication methods, and it allows users to commit GitHub Actions from Databricks Git folders using the Databricks GitHub app for authorization.

If you are the owner of a Databricks account where the Databricks GitHub app is installed and configured to support OAuth, you may receive an email titled "Databricks is requesting updated permissions" from GitHub. This is a legitimate request from Databricks. Accept the new permissions to enable committing GitHub Actions from Databricks Git folders with the Databricks GitHub app.

Query snippets are now available in the new SQL editor, notebooks, files, and dashboards

May 9, 2025

Query snippets are segments of queries that you can share and trigger using autocomplete. You can now create query snippets through the View menu in the new SQL editor, and also in the notebook and file editors. You can use your query snippets in the SQL editor, notebook SQL cells, SQL files, and SQL datasets in dashboards.

See Query snippets.

Billing and audit system tables are now available in AWS GovCloud

May 9, 2025

The system.billing schema and the system.access.audit table are now supported on AWS GovCloud. System tables are a Databricks-hosted analytical store of your account's operational data, accessible in the system catalog. System tables in schemas other than billing and access.audit are not available on AWS GovCloud, and system tables are not available in AWS GovCloud DoD. For more information, see Monitor account activity with system tables.
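
As a quick illustration, the following notebook sketch queries the audit table. The columns shown follow the standard system.access.audit schema; verify them against your workspace.

    # Most recent audit events from the last 7 days.
    recent_audit = spark.sql("""
        SELECT event_time, user_identity.email, action_name, service_name
        FROM system.access.audit
        WHERE event_date >= current_date() - INTERVAL 7 DAYS
        ORDER BY event_time DESC
        LIMIT 100
    """)
    display(recent_audit)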

You can now create views in ETL pipelines

May 8, 2025

The CREATE VIEW SQL command is now available in ETL pipelines. You can create a dynamic view of your data. See CREATE VIEW (DLT).
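For example, in a pipeline's SQL source you might write CREATE VIEW filtered_orders AS SELECT ... to define a view over another dataset in the pipeline. A minimal sketch of the equivalent Python pipeline definition, using the existing dlt.view decorator and hypothetical dataset names, looks like this:

    # A pipeline view over an upstream dataset (hypothetical names).
    import dlt
    from pyspark.sql import functions as F

    @dlt.view(comment="Orders filtered to completed status")
    def filtered_orders():
        return dlt.read("orders").where(F.col("status") == "COMPLETED")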

Configure Python syntax highlighting in Databricks notebooks

May 8, 2025

You can now configure Python syntax highlighting in notebooks by placing a pyproject.toml file in the notebook's ancestor path or your home folder. Through the pyproject.toml file, you can configure ruff, pylint, pyright, and flake8 linters, as well as disable Databricks-specific rules. This configuration is supported for clusters running Databricks Runtime 16.4 or above, or Client 3.0 or above.

See Configure Python syntax highlighting.
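
As a rough sketch, a pyproject.toml like the following adjusts what the editor flags; the rule selections are illustrative and follow standard ruff configuration.

    # pyproject.toml -- picked up from the notebook's ancestor path or home folder
    [tool.ruff]
    line-length = 100

    [tool.ruff.lint]
    # Example: ignore unused-import warnings in exploratory notebooks
    ignore = ["F401"]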

Jobs and pipelines now share a single, unified view (Public Preview)

May 7, 2025

You can now view all workflows, including jobs, ETL pipelines, and ingestion pipelines, in a single unified list. See View jobs and pipelines.

File events for external locations improve file notifications in Auto Loader and file arrival triggers in jobs (Public Preview)

May 5, 2025

You can now enable file events on external locations that are defined in Unity Catalog. This makes file arrival triggers in jobs and file notifications in Auto Loader more scalable and efficient.

This feature is in Public Preview. Auto Loader support for file events requires enablement by a Databricks representative. For access, reach out to your Databricks account team.

For details, see the documentation on file events for external locations, Auto Loader file notification mode, and file arrival triggers.
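
A minimal Auto Loader sketch using file events might look like the following. The paths are hypothetical, and the cloudFiles.useManagedFileEvents option name should be verified against the current Auto Loader documentation.

    # Incrementally ingest new JSON files from a UC external location with file
    # events enabled, instead of directory listing or per-stream notifications.
    stream = (
        spark.readStream.format("cloudFiles")
        .option("cloudFiles.format", "json")
        .option("cloudFiles.useManagedFileEvents", "true")
        .load("s3://my-bucket/landing/")  # path under the external location
    )
    query = (
        stream.writeStream
        .option("checkpointLocation", "s3://my-bucket/checkpoints/landing/")
        .toTable("main.bronze.landing_events")
    )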

April 2025

The following features and updates were released on Databricks on AWS GovCloud in April 2025.

Deletion vectors on DLT tables now follow workspace settings

April 28, 2025

New streaming tables and materialized views will follow the workspace settings for deletion vectors. See Auto-enable deletion vectors and What are deletion vectors?.

Share streaming tables and materialized views using Delta Sharing (Public Preview)

April 23, 2025

You can now use Delta Sharing to share streaming tables and materialized views.

See Create and manage shares for Delta Sharing and Read shared streaming tables and materialized views.

Strict enforcement of row-level security and column masking policies in Delta Sharing

April 21, 2025

Delta Sharing now consistently enforces the row-level security and column masking policies applied to tables that a shared data asset depends on, whether those policies were applied before or after the asset was shared. Recipients may notice differences in query behavior when accessing shared data that depends on tables with row filters or column masks. This ensures that data access always aligns with the provider's intended security controls.

See Filter sensitive table data using row filters and column masks.
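
For context, a policy of the kind this change covers looks like the following sketch (hypothetical catalog, schema, and table names): a SQL UDF that decides row visibility, attached to a table as a row filter.

    # Define the filter function, then attach it to the shared table's source.
    spark.sql("""
        CREATE OR REPLACE FUNCTION main.security.us_only(region STRING)
        RETURNS BOOLEAN
        RETURN region = 'US'
    """)
    spark.sql("""
        ALTER TABLE main.sales.orders
        SET ROW FILTER main.security.us_only ON (region)
    """)

With this release, recipients of a share that depends on main.sales.orders see only the filtered rows, regardless of whether the filter was attached before or after the table was shared.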

Run a subset of tasks within a job

April 21, 2025

You can now run a subset of the tasks when manually triggering a job. See Run a job with different settings.

Python type error highlighting

April 14, 2025

The notebook and file editors now highlight type errors in Python code, such as references to non-existent attributes, missing arguments, and mismatched argument types. See Python error highlighting.
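
For example, each call below is the kind of mistake the editor flags:

    def add(a: int, b: int) -> int:
        return a + b

    add(1)                        # missing argument
    add(1, "two")                 # mismatched argument type
    "text".not_a_real_method()    # non-existent attribute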

Reference SQL output in downstream tasks of a job

April 14, 2025

You can now use dynamic values to reference the output of a SQL task in downstream tasks of the same job. A For each task can iterate over the rows of data in the output.

See What is a dynamic value reference?.

Batch Unity Catalog Python UDFs (Public Preview)

April 14, 2025

Unity Catalog Batch Python UDFs extend the capabilities of Unity Catalog UDFs by allowing you to write Python code to operate on batches of data, significantly improving efficiency by reducing the overhead associated with row-by-row UDFs. Batch Python UDFs support service credentials to access external cloud services. See Batch Python User-defined functions (UDFs) in Unity Catalog.
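
A minimal sketch, with hypothetical function and handler names, following the documented CREATE FUNCTION ... PARAMETER STYLE PANDAS form: the handler consumes an iterator of pandas Series batches rather than single rows.

    # Register the batch UDF, then call it from SQL.
    spark.sql("""
        CREATE OR REPLACE FUNCTION main.default.batch_plus_one(x INT)
        RETURNS INT
        LANGUAGE PYTHON
        PARAMETER STYLE PANDAS
        HANDLER 'plus_one'
        AS $$
        from typing import Iterator
        import pandas as pd

        def plus_one(batches: Iterator[pd.Series]) -> Iterator[pd.Series]:
            for batch in batches:
                yield batch + 1
        $$
    """)
    spark.sql("SELECT main.default.batch_plus_one(41) AS answer").show()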

Access UDF context information using TaskContext

April 14, 2025

The TaskContext PySpark API now allows you to retrieve context information—such as user identity and cluster tags—while running Batch Unity Catalog Python UDFs or PySpark UDFs. This feature lets you pass user-specific details, like identity, to authenticate external services within UDFs. See Get task context in a UDF.
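
A short sketch of the pattern follows. TaskContext.get() and getLocalProperty() are standard PySpark APIs, but the property key below is a placeholder; see Get task context in a UDF for the supported keys.

    from pyspark import TaskContext
    from pyspark.sql.functions import udf

    @udf("string")
    def calling_user():
        # Runs on the executor; the key name here is illustrative.
        ctx = TaskContext.get()
        return ctx.getLocalProperty("user")

    display(spark.range(1).withColumn("caller", calling_user()))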

The BROWSE privilege is GA

April 1, 2025

The BROWSE privilege is now generally available. The BROWSE privilege allows you to grant users, service principals, and account groups permission to view a Unity Catalog object's metadata. This enables users to discover data without having read access to the data. A user can view an object's metadata using Catalog Explorer, the schema browser, search results, the lineage graph, information_schema, and the REST API.

See BROWSE.
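
For example, granting metadata-only discovery on a catalog (hypothetical catalog and group names):

    # Members of `data-consumers` can see object metadata in Catalog Explorer,
    # search, and lineage, but cannot read the data itself.
    spark.sql("GRANT BROWSE ON CATALOG main TO `data-consumers`")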
