Azure Cosmos DB


The Azure Cosmos DB Spark Connector is developed by Microsoft. The connector is in preview and requires Databricks Runtime 3.4 or above.

This page explains how to read data from and write data to Azure Cosmos DB from Databricks Runtime clusters.


To set up the Azure Cosmos DB Spark connector:

  1. Download the following libraries as JAR files, following the links. You do not need to download the dependencies of these libraries.
  2. Upload the downloaded JAR files to Databricks following the guidance in Uploading Libraries.
  3. Attach the uploaded libraries to your cluster.

Using Azure Cosmos DB Spark Connector

The following Scala notebook provides a simple example of how to write data to Cosmos DB and read data from Cosmos DB. See the Azure Cosmos DB Spark Connector project for detailed documentation. The Azure Cosmos DB Spark Connector User Guide, developed by Microsoft, also shows how to use this connector in Python.
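As a rough sketch of what such a notebook contains, the snippet below writes a DataFrame to Cosmos DB and reads it back through the connector's `com.microsoft.azure.cosmosdb.spark` data source. The endpoint, key, database, and collection values are placeholders you must replace with your own account settings, and the option names should be verified against the connector's documentation for your connector version.

```scala
import org.apache.spark.sql.SparkSession

// In a Databricks notebook, `spark` is already defined.
val spark = SparkSession.builder().getOrCreate()

// Placeholder connection settings — replace with your Cosmos DB account values.
val cosmosConfig = Map(
  "Endpoint"   -> "https://<your-account>.documents.azure.com:443/",
  "Masterkey"  -> "<your-master-key>",
  "Database"   -> "<your-database>",
  "Collection" -> "<your-collection>"
)

// Write a small sample DataFrame to the configured collection.
val df = spark.range(10).withColumnRenamed("id", "value")
df.write
  .format("com.microsoft.azure.cosmosdb.spark")
  .options(cosmosConfig)
  .mode("append")
  .save()

// Read the collection back as a DataFrame.
val readDf = spark.read
  .format("com.microsoft.azure.cosmosdb.spark")
  .options(cosmosConfig)
  .load()
readDf.show()
```

This sketch assumes the connector JARs from the setup steps are already attached to the cluster; without them, the `format(...)` calls will fail at runtime.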