November 2021

These features and Databricks platform improvements were released in November 2021.

Note

Releases are staged. Your Databricks account may not be updated until a week or more after the initial release date.

Create tags for feature tables (Public Preview)

November 30 - December 6, 2021: Version 3.60

You can now create tags for feature tables and use those tags to search for feature tables.

Syntax highlighting and autocomplete for SQL commands in Python cells

November 30 - December 6, 2021: Version 3.60

Syntax highlighting and SQL autocomplete are available when you use SQL inside a Python command, such as in a spark.sql command.
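
For example, the SQL string passed to spark.sql in a Python cell like the following now gets SQL syntax highlighting and autocomplete; the events table name is a placeholder, and spark and display are provided by the notebook environment.

# The SQL inside the triple-quoted string is highlighted and autocompleted.
df = spark.sql("""
    SELECT event_type, count(*) AS num_events
    FROM events
    GROUP BY event_type
""")
display(df)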

Rename, delete, and change permissions for MLflow experiments from experiment page (Public Preview)

November 30 - December 6, 2021: Version 3.60

You can now rename, delete, and change permissions for an MLflow experiment from its experiment page. For details, see Manage experiments.

New data profiles in notebooks: tabular and graphic summaries of your data (Public Preview)

November 30 - December 6, 2021: Version 3.60

When you use display(<dataframe>) in Scala or Python, or run a SQL query, the results pane includes a new Data Profile tab that presents an interactive tabular and graphic summary of the DataFrame or table. For details, see Create a new data profile and Create a new visualization.

You can also use the Databricks utilities command dbutils.data.summarize.
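
For example, in a Python notebook cell (spark, display, and dbutils are provided by the notebook environment):

# Build a small DataFrame, then open the Data Profile tab in the display()
# results pane, or generate the same summary with dbutils.
df = spark.createDataFrame(
    [("a", 1.0), ("b", 2.5), ("a", 4.0)],
    ["category", "value"],
)
display(df)                 # results pane now includes a Data Profile tab
dbutils.data.summarize(df)  # renders the same tabular and graphic summary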

Improved logging when schemas evolve while running a Delta Live Tables pipeline

November 30 - December 6, 2021: Version 3.60

If your Delta Live Tables pipeline reads data with Auto Loader and the schema of the input data changes while an update is running, the update is logged as CANCELED and automatically retried. The new update is logged with the state SCHEMA_CHANGE. Previously, when the input schema changed, the update was logged as FAILED even though Delta Live Tables automatically retried it.
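
For context, a minimal Delta Live Tables pipeline that ingests data with Auto Loader looks roughly like the following sketch; the source path and JSON format are placeholders, and spark is provided by the pipeline runtime.

import dlt

# Minimal Auto Loader source; a schema change in the incoming files while an
# update is running now triggers the logging and retry behavior described above.
@dlt.table
def raw_events():
    return (
        spark.readStream.format("cloudFiles")
        .option("cloudFiles.format", "json")
        .load("/mnt/raw/events")
    )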

Databricks Partner Connect GA

November 18, 2021

Databricks Partner Connect is now generally available. You can use Partner Connect to quickly and easily discover and connect popular data and AI tools to your Databricks lakehouse. See What is Databricks Partner Connect?.

Breaking change: remove escaping and quotes from $ in environment variable values for cluster creation

November 15-30, 2021: Version 3.59

Note

This change will be reverted on December 3, 2021 from 01:00-03:00 UTC.

When you create a cluster, you can specify environment variables. Before this change, to use a $ within an environment variable value, you had to escape it or surround it with quotes. With this change, escape characters and quotes are no longer needed. This is a breaking change because escaped or quoted $ characters in environment variable values are no longer interpreted as $. For example, given the following environment variable declarations:

WITH_ESCAPING=\\$123
WITH_DOUBLE_QUOTING="$123"
NO_ESCAPING=$123
WITH_SINGLE_QUOTING='$123'
NO_QUOTING=$123

Printing these values from a notebook, for example:

%sh
echo ${WITH_ESCAPING}
echo ${WITH_DOUBLE_QUOTING}
echo ${NO_ESCAPING}
echo ${WITH_SINGLE_QUOTING}
echo ${NO_QUOTING}

Returns the following values:

\\$123
"$123"
$123
driver23
$123

Note

This change applies only to environment variables. It does not apply to secrets.
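
For reference, these environment variables are the ones you supply when creating the cluster, for example through the spark_env_vars field of a Clusters API request. The following Python sketch assumes a hypothetical workspace URL and personal access token, with illustrative runtime and node type values.

import requests

# Sketch: create a cluster with an environment variable via the Clusters API.
# The workspace URL and token are placeholders; spark_version and node_type_id
# are example values.
resp = requests.post(
    "https://<workspace-url>/api/2.0/clusters/create",
    headers={"Authorization": "Bearer <personal-access-token>"},
    json={
        "cluster_name": "env-var-example",
        "spark_version": "9.1.x-scala2.12",
        "node_type_id": "i3.xlarge",
        "num_workers": 1,
        "spark_env_vars": {"NO_QUOTING": "$123"},  # the $ is now passed through literally
    },
)
resp.raise_for_status()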

Ease of use improvements for Files in Repos

November 15-30, 2021: Version 3.59

New capabilities have been added to the files text editor:

  • A closing tag is automatically generated when you type an opening tag.

  • Code folding is now available.

In addition, for text files, you can now copy the full repo path as well as the path from the repo root. For details, see Programmatically interact with workspace files.

Support for legacy SQL widgets ends on January 15, 2022

November 15, 2021

Databricks will end support for legacy SQL widgets on January 15, 2022. After that date, notebooks will no longer render legacy SQL widgets in the UI, but parameters passed through %run will still work with the legacy SQL widget API. To ensure that your widgets continue to render in the UI, update your code to use the SQL widgets.

User interface improvements for Databricks jobs

November 15-30, 2021: Version 3.59

This release includes the following enhancements to the jobs UI:

  • You can now clone a single task, making it easier to add new tasks to an existing job. See Create a task from an existing task.

  • When viewing job run details or task run details, the Job ID and Job run ID are now links, allowing you to easily switch between job detail views. See View job run details.

  • You can now copy the path to a task, for example a notebook path, when viewing or editing a task. See Copy a task path.

Delta Sharing Connector for Power BI

November 15, 2021

We have released the Power BI Delta Sharing connector, which allows users to discover, analyze, and visualize shared datasets through the Delta Sharing open protocol. The protocol enables secure exchange of datasets across products and platforms by leveraging REST and cloud storage. For details, see Power BI Delta Sharing connector.

Databricks ODBC driver 2.6.19

November 12, 2021

We have released version 2.6.19 of the Databricks ODBC driver (download). The new driver propagates SQL error conditions (SQLState) returned by Databricks to the client.

Databricks Runtime 10.1 and 10.1 ML are GA; 10.1 Photon is Public Preview

November 10, 2021

Databricks Runtime 10.1 and 10.1 ML are now generally available. 10.1 Photon is in Public Preview.

See Databricks Runtime 10.1 (unsupported) and Databricks Runtime 10.1 for ML (unsupported).

Databricks Runtime 10.1 (Beta)

November 4, 2021

Databricks Runtime 10.1, 10.1 Photon, and 10.1 ML are now available as Beta releases.

See the full release notes at Databricks Runtime 10.1 (unsupported) and Databricks Runtime 10.1 for ML (unsupported).

Rename and delete MLflow experiments (Public Preview)

November 2-8, 2021: Version 3.58

You can now rename, delete, and change permissions for an MLflow experiment from the Experiments page.

Photon support for additional cluster instance families

November 2-8, 2021: Version 3.58

Support for the Photon query engine has been added to an expanded list of i3 and i3en instance types, as well as the following cluster instance families:

  • m5d

  • m5dn

  • r5d

  • r5dn

When you create a cluster using the UI, you can select a Photon-enabled instance type from among the instances available to you. When you create a cluster using the REST API or the CLI, set spark_version to a Photon-enabled runtime version by using the syntax <databricks-runtime-version>-photon-scala2.12.
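
For example, a Clusters API request body for a Photon-enabled Databricks Runtime 10.1 cluster might look like the following sketch; the cluster name, node type, and worker count are illustrative values, not requirements.

# Request body for POST /api/2.0/clusters/create using a Photon runtime version.
photon_cluster = {
    "cluster_name": "photon-example",
    "spark_version": "10.1.x-photon-scala2.12",  # <databricks-runtime-version>-photon-scala2.12
    "node_type_id": "i3.xlarge",                 # i3 instance types support Photon
    "num_workers": 2,
}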

You can now create a cluster policy by cloning an existing policy

November 2-8, 2021: Version 3.58

Support has been added for creating a new cluster policy by cloning an existing policy. See Create and manage compute policies.

Single sign-on (SSO) in the account console is Generally Available

November 2-8, 2021: Version 3.58

Single sign-on (SSO) in the account console is now Generally Available. Account administrators can authenticate to the account console using SSO backed by their organization’s identity provider. The account must be on the E2 version of the Databricks platform. Most Databricks accounts are now on E2; if you are unsure, consult your Databricks account team.

Change the default language of notebooks and notebook cells more easily

November 2-8, 2021: Version 3.58

A new language button appears at the top of the notebook and in each cell. You can use this button to select the default language for the notebook, and to select the language for a specific cell. For more information, see Set default language and Mix languages.

Use Files in Repos from the web terminal

November 2-8, 2021: Version 3.58

You can now access Files in Repos from the web terminal by using paths under /Workspace (for example, /Workspace/Repos/<user-folder>/<repo-name>).