April 2020

These features and Databricks platform improvements were released in April 2020.

Note

Releases are staged. Your Databricks account might not be updated until a week or more after the initial release date.

MLflow tracking UI enhancement

April 23-30, 2020: Version 3.18

The MLflow UI now offers an option to delete child runs when you delete a root run.

Notebook usability improvements

April 23-30, 2020: Version 3.18

This release brings several usability improvements when working with notebooks:

  • You can now select adjacent notebook cells using Shift + Up (previous cell) or Shift + Down (next cell). Multi-selected cells can be copied, cut, deleted, and pasted.

  • When you delete a cell, a confirmation dialog appears by default. You can disable it by selecting the Do not show this again checkbox and clicking Confirm. You can also toggle the confirmation dialog at any time with the Turn on command delete confirmation option in User Settings Icon > User Settings > Developer.

Databricks Connect now supports Databricks Runtime 6.5

April 20, 2020

Databricks Connect now supports Databricks Runtime 6.5.

Databricks Runtime 6.1 and 6.1 ML support ends

April 16, 2020

Support for Databricks Runtime 6.1 and Databricks Runtime 6.1 for Machine Learning ended on April 16. See Databricks support lifecycles.

Databricks Runtime 6.5 GA

April 14, 2020

Databricks Runtime 6.5 brings many library upgrades and new features, including:

  • Operation metrics for all writes, updates, and deletes on a Delta table now appear in table history

  • You can rate-limit the data processed in Delta Lake streaming micro-batches

  • The Snowflake connector is updated to version 2.5.9

For more information, see the complete Databricks Runtime 6.5 (EoS) release notes.
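The rate-limiting feature above is configured through options on the Delta streaming source. A minimal sketch, assuming the documented maxFilesPerTrigger and maxBytesPerTrigger option names; the path and limit values below are made up for illustration:

```python
# Illustrative rate-limit options for a Delta Lake streaming source.
# maxFilesPerTrigger / maxBytesPerTrigger cap how much data each
# micro-batch processes; the values here are examples only.
rate_limits = {
    "maxFilesPerTrigger": "100",  # at most 100 new files per micro-batch
    "maxBytesPerTrigger": "1g",   # soft cap on bytes read per micro-batch
}

# On a cluster, these options would be applied to the stream reader, e.g.:
#   spark.readStream.format("delta").options(**rate_limits).load("/delta/events")
```

When both options are set, the micro-batch processes up to the byte limit or the file limit, whichever is reached first.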

Databricks Runtime 6.5 for Machine Learning GA

April 14, 2020

Databricks Runtime 6.5 ML brings the following library upgrade:

  • MLflow upgraded from 1.5.0 to 1.7.0

For more information, see the complete Databricks Runtime 6.5 for ML (EoS) release notes.

Databricks Runtime 6.5 for Genomics GA

April 14, 2020

Databricks Runtime 6.5 for Genomics is built on top of Databricks Runtime 6.5.

Authenticate to S3 buckets automatically using your IAM credentials (Public Preview)

April 9-14, 2020: Version 3.17

IAM credential passthrough allows you to authenticate automatically to S3 buckets from Databricks clusters by using the identity that you use to log in to Databricks. When you enable your cluster for IAM credential passthrough, commands that you run on that cluster can read and write data in S3 using your identity. IAM credential passthrough has two key benefits over securing access to S3 buckets using instance profiles:

  • IAM credential passthrough allows multiple users with different data access policies to share one Databricks cluster to access data in S3 while always maintaining data security. An instance profile can be associated with only one IAM role. This requires all users on a Databricks cluster to share that role and the data access policies of that role.

  • IAM credential passthrough associates a user with an identity. This in turn enables S3 object logging via CloudTrail. All S3 access is tied directly to the user via the ARN in CloudTrail logs.

For details, see Access S3 with IAM credential passthrough with SCIM (legacy).

Note

This feature is not available in all workspaces. Contact your Databricks sales representative for information about whether the feature is available for your workspace.
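Passthrough is enabled per cluster. A hedged sketch of a cluster spec for the Clusters API, assuming the spark.databricks.passthrough.enabled Spark conf is the passthrough switch; the cluster name, node type, and worker count are made up (see the linked passthrough documentation for the exact settings your workspace requires):

```python
# Hypothetical cluster spec enabling IAM credential passthrough.
# Only the spark_conf flag relates to passthrough; all other values
# are illustrative placeholders.
cluster_spec = {
    "cluster_name": "passthrough-demo",      # illustrative name
    "spark_version": "6.5.x-scala2.11",      # a passthrough-capable runtime
    "node_type_id": "i3.xlarge",             # illustrative node type
    "num_workers": 2,
    "spark_conf": {
        # Commands on this cluster access S3 with the signed-in user's identity.
        "spark.databricks.passthrough.enabled": "true",
    },
}
# On a real workspace, this dict would be POSTed to /api/2.0/clusters/create.
```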

IAM role renamed to instance profile

April 9-14, 2020: Version 3.17

Databricks has changed the IAM Role label in the web application to Instance Profile. This is in line with AWS terminology and is consistent with the Databricks Instance Profile API. You will see this change in your workspace and in our documentation.

Easier notebook title changes

April 9-14, 2020: Version 3.17

You can now change the title of an open notebook by clicking the title and editing inline instead of clicking File > Rename.

Cluster termination reporting enhancement

April 9-14, 2020: Version 3.17

Terminated clusters now return a type field that indicates why the cluster was terminated. See Clusters API.
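A sketch of reading that field from a cluster record, assuming the termination_reason object shape documented in the Clusters API; the field values below are made-up examples:

```python
# Illustrative fragment of a Clusters API response for a terminated cluster.
# The cluster_id and reason values are examples, not real output.
sample_response = {
    "cluster_id": "1234-567890-abc123",
    "state": "TERMINATED",
    "termination_reason": {
        "code": "INACTIVITY",  # specific cause of termination
        "type": "SUCCESS",     # new field: broad category of the reason
    },
}

# Summarize the reason, tolerating records that predate the new field.
reason = sample_response.get("termination_reason", {})
summary = f"{reason.get('type', 'UNKNOWN')}: {reason.get('code', 'UNKNOWN')}"
```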

DBFS REST API delete endpoint size limit

From May 5, 2020, when you delete a large number of files recursively using the DBFS API, the delete operation is performed in increments. The call returns a response after approximately 45 seconds with an error message asking you to re-invoke the delete operation until the directory structure is fully deleted. For example:

{
  "error_code": "PARTIAL_DELETE",
  "message": "The requested operation has deleted 324 files. There are more files remaining. You must make another request to delete more."
}
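A client handling this behavior simply re-invokes the endpoint until the PARTIAL_DELETE error stops appearing. A minimal sketch: call_dbfs_delete below is a hypothetical stand-in for a real POST to the DBFS delete endpoint; only the PARTIAL_DELETE error code comes from the release note.

```python
# Hedged sketch of a client-side retry loop for incremental DBFS deletes.
# `call_dbfs_delete` is assumed to perform one delete request and return
# the decoded JSON response body as a dict.
def delete_until_done(call_dbfs_delete, path, max_attempts=100):
    """Re-invoke the delete endpoint until the directory is fully deleted."""
    for attempt in range(1, max_attempts + 1):
        response = call_dbfs_delete(path=path, recursive=True)
        if response.get("error_code") != "PARTIAL_DELETE":
            # Fully deleted, or a different error that retrying won't fix.
            return attempt
    raise RuntimeError(f"{path} not fully deleted after {max_attempts} calls")
```

In practice, call_dbfs_delete would wrap an HTTP POST to your workspace URL with an authorization token; the loop itself is independent of the HTTP client used.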

Databricks Runtime 6.0 and 6.0 ML support ends

April 1, 2020

Support for Databricks Runtime 6.0 and Databricks Runtime 6.0 for Machine Learning ended on April 1. See Databricks support lifecycles.