November 2019

These features and Databricks platform improvements were released in November 2019.

Note

Releases are staged. Your Databricks account may not be updated until a week or more after the initial release date.

Databricks Runtime 6.2 ML Beta

November 15, 2019

Databricks Runtime 6.2 ML Beta brings many library upgrades, including:

  • TensorFlow and TensorBoard: 1.14.0 to 1.15.0.

  • PyTorch: 1.2.0 to 1.3.0.

  • tensorboardX: 1.8 to 1.9.

  • MLflow: 1.3.0 to 1.4.0.

  • Hyperopt: 0.2-db1 with Databricks MLflow integrations (see the sketch below).

  • mleap-databricks-runtime: 0.15.0, now including mleap-xgboost-runtime.

For more information, see the complete Databricks Runtime 6.2 for ML (unsupported) release notes.
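
As a rough illustration of the Hyperopt and MLflow integration noted above, here is a minimal sketch, assuming the bundled hyperopt package, its SparkTrials class, and a notebook attached to a Databricks Runtime 6.2 ML cluster; the objective function and search space are purely illustrative:

```python
# Minimal sketch: distribute Hyperopt trials with SparkTrials.
# With the Databricks-bundled hyperopt (0.2-db1), trials launched through
# SparkTrials are expected to be tracked in MLflow automatically.
from hyperopt import fmin, tpe, hp, SparkTrials

def objective(x):
    # Illustrative loss; replace with real model training and evaluation.
    return (x - 3) ** 2

best = fmin(
    fn=objective,
    space=hp.uniform("x", -10, 10),    # search space for the single parameter x
    algo=tpe.suggest,                  # Tree-structured Parzen Estimator search
    max_evals=20,
    trials=SparkTrials(parallelism=2), # run trials in parallel on the cluster
)
print(best)  # e.g. {'x': 2.98...}
```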

Databricks Runtime 6.2 Beta

November 15, 2019

Databricks Runtime 6.2 Beta brings new features, improvements, and many bug fixes, including:

  • Delta Lake insert-only merge optimized (see the sketch below)

  • Redshift connector adds multi-region support for reads

For more information, see the complete Databricks Runtime 6.2 (unsupported) release notes.
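
As a rough sketch of the insert-only merge pattern this optimization targets, assuming a notebook where spark is available, an existing Delta table named events, and an incoming DataFrame with a matching schema (all names below are illustrative):

```python
# Minimal sketch: an insert-only MERGE (no WHEN MATCHED clause), the Delta Lake
# pattern that Databricks Runtime 6.2 optimizes. Table and column names are
# placeholders; `events` is assumed to be an existing Delta table.
new_events = spark.createDataFrame(
    [(101, "click"), (102, "view")], ["event_id", "event_type"]
)
new_events.createOrReplaceTempView("updates")

spark.sql("""
    MERGE INTO events AS t
    USING updates AS s
    ON t.event_id = s.event_id
    WHEN NOT MATCHED THEN INSERT *
""")
```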

Configure clusters with your own container image using Databricks Container Services

November 7 - 19, 2019: Version 3.6

Generally available in Databricks Runtime 6.1 and Databricks Platform version 3.6, Databricks Container Services lets you configure a cluster with your own container image. You can pre-package complex environments within a container, publish it to a popular container registry such as Azure Container Registry (ACR), Amazon Elastic Container Registry (ECR), or Docker Hub, and then have Databricks pull the image to build a cluster. Some example use cases include:

  • Library customization - you have full control over the system libraries you want installed

  • Golden container environment - your Docker image is a locked-down environment that will never change

  • Docker CI/CD integration - you can integrate Databricks with your Docker CI/CD pipelines

There are many other use cases, ranging from specifying configuration to installing machine learning packages.

For details, see Customize containers with Databricks Container Service.
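
As one way to put this to use, here is a minimal sketch that creates a cluster from a custom image through the Clusters REST API; the workspace URL, token, image URL, node type, and registry credentials are all placeholders, and the field layout reflects a reading of the API of this era rather than a definitive recipe:

```python
# Minimal sketch: create a cluster that pulls a custom Docker image.
# All angle-bracket values are placeholders you must replace.
import requests

resp = requests.post(
    "https://<your-workspace>.cloud.databricks.com/api/2.0/clusters/create",
    headers={"Authorization": "Bearer <personal-access-token>"},
    json={
        "cluster_name": "containerized-cluster",
        "spark_version": "6.1.x-scala2.11",   # Databricks Runtime 6.1 or above
        "node_type_id": "i3.xlarge",
        "num_workers": 2,
        "docker_image": {
            "url": "<registry>/<repository>/my-image:latest",
            "basic_auth": {"username": "<user>", "password": "<token-or-password>"},
        },
    },
)
resp.raise_for_status()
print(resp.json()["cluster_id"])
```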

Cluster detail now shows only cluster ID in the HTTP path

November 7 - 19, 2019: Version 3.6

When you connect a BI tool to Databricks using JDBC or ODBC, you should use the cluster ID variant of the HTTP path, because the cluster ID is unique to the cluster, while cluster names are not. The cluster name option no longer appears on the JDBC/ODBC tab of the cluster detail page.
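
For reference, a minimal sketch of building the cluster ID variant of the HTTP path and dropping it into a JDBC URL; the host, organization ID, cluster ID, and driver property names below are placeholders and assumptions, not values copied from your workspace:

```python
# Minimal sketch: construct the HTTP path from the cluster ID rather than the
# cluster name. All values are placeholders.
host = "dbc-xxxxxxxx-xxxx.cloud.databricks.com"
org_id = "0"                         # workspace (organization) ID
cluster_id = "1114-123456-abcde123"  # copy this from the cluster detail page

http_path = f"sql/protocolv1/o/{org_id}/{cluster_id}"

jdbc_url = (
    f"jdbc:spark://{host}:443/default;transportMode=http;ssl=1;"
    f"httpPath={http_path};AuthMech=3;UID=token;PWD=<personal-access-token>"
)
print(jdbc_url)
```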

Secrets referenced by Spark configuration properties and environment variables (Public Preview)

November 7, 2019

Available in Databricks Runtime 6.1 and above.

As of the November 7 maintenance update of Databricks Runtime 6.1, the ability to reference a secret in a Spark configuration property or environment variable is in Public Preview. For details, see Use a secret in a Spark configuration property or environment variable.
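
As a rough illustration, here is a sketch of cluster settings that reference a secret by path instead of embedding the plaintext value; the scope and key names are placeholders:

```python
# Minimal sketch: Spark configuration and environment variable values that
# reference a secret. The {{secrets/<scope>/<key>}} reference is resolved by
# Databricks when the cluster starts; "my-scope" and "my-key" are placeholders.
cluster_settings = {
    "spark_conf": {
        # Spark configuration property whose value comes from a secret
        "spark.my.app.password": "{{secrets/my-scope/my-key}}",
    },
    "spark_env_vars": {
        # Environment variable populated from the same secret
        "MY_APP_PASSWORD": "{{secrets/my-scope/my-key}}",
    },
}
```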