Migrate to Databricks Connect for Scala

Note

Databricks Connect for Databricks Runtime 13.3 LTS and above for Scala is in Public Preview.

This article describes how to migrate from Databricks Connect for Databricks Runtime 12.2 LTS and below to Databricks Connect for Databricks Runtime 13.3 LTS and above for Scala. Databricks Connect enables you to connect popular IDEs, notebook servers, and custom applications to Databricks clusters. See What is Databricks Connect?. For the Python version of this article, see Migrate to Databricks Connect for Python.

Note

Before you begin to use Databricks Connect, you must set up the Databricks Connect client.

To upgrade your project, do the following:

  1. Install the correct versions of the Java Development Kit (JDK) and Scala, as listed in the installation requirements, to match your Databricks cluster, if they are not already installed locally.

  2. In your Scala project’s build file, such as build.sbt for sbt, pom.xml for Maven, or build.gradle for Gradle, update the reference to the Databricks Connect client as follows:

    sbt

    libraryDependencies += "com.databricks" % "databricks-connect" % "14.0.0"

    Maven

    <dependency>
      <groupId>com.databricks</groupId>
      <artifactId>databricks-connect</artifactId>
      <version>14.0.0</version>
    </dependency>

    Gradle

    implementation 'com.databricks:databricks-connect:14.0.0'

    Replace 14.0.0 with the version of the Databricks Connect library that matches the Databricks Runtime version on your cluster. You can find the Databricks Connect library version numbers in the Maven Central Repository. A minimal build.sbt sketch that combines this dependency with a matching Scala version appears after this list.

  3. Update your Scala code to initialize the spark variable (which represents an instantiation of the DatabricksSession class, similar to SparkSession in Spark), as sketched after this list. For more code examples, see Code examples for Databricks Connect for Scala.
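
For steps 1 and 2, the following is a minimal build.sbt sketch for sbt. The Scala version (2.12.15) and Databricks Connect version (14.0.0) shown here are examples only; use the versions that match the Databricks Runtime on your cluster.

    // Minimal build.sbt sketch. The versions below are examples; use the Scala
    // and Databricks Connect versions that match your cluster's Databricks Runtime.
    scalaVersion := "2.12.15"

    libraryDependencies += "com.databricks" % "databricks-connect" % "14.0.0"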
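
For step 3, the following is a minimal sketch of initializing the spark variable with DatabricksSession. It assumes that your connection details are provided through a Databricks configuration profile or environment variables, as described in the client setup documentation; the builder also accepts explicit connection settings, and the object name Main is arbitrary.

    import com.databricks.connect.DatabricksSession

    object Main {
      def main(args: Array[String]): Unit = {
        // Build the session from your Databricks configuration profile or
        // environment variables (workspace host, token, and cluster ID).
        val spark = DatabricksSession.builder().getOrCreate()

        // Run a trivial query on the cluster to verify the connection.
        spark.range(5).show()
      }
    }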