Libraries CLI

You run Databricks libraries CLI subcommands by appending them to databricks libraries.

databricks libraries -h
Usage: databricks libraries [OPTIONS] COMMAND [ARGS]...

  Utility to interact with libraries.

Options:
  -v, --version  [VERSION]
  -h, --help     Show this message and exit.

Commands:
  all-cluster-statuses  Get the status of all libraries on all clusters.
  cluster-status        Get the status of all libraries for a cluster.
    Options:
      --cluster-id CLUSTER_ID   Can be found in the URL at https://<databricks-instance>/?o=<16-digit-number>#/setting/clusters/$CLUSTER_ID/configuration.
  install               Install a library on a cluster.
    Options:
      --cluster-id CLUSTER_ID   Can be found in the URL at https://<databricks-instance>/?o=<16-digit-number>#/setting/clusters/$CLUSTER_ID/configuration.
      --jar TEXT                JAR on DBFS or S3 or WASB.
      --egg TEXT                Egg on DBFS or S3 or WASB.
      --whl TEXT                Wheel or zipped wheelhouse on DBFS or S3 or WASB. Supported in CLI 0.8.2 and above.
                                Recommended for clusters running Databricks Runtime 4.2 or above.
      --maven-coordinates TEXT  Maven coordinates in the form of GroupId:ArtifactId:Version (for example, org.jsoup:jsoup:1.7.2).
      --maven-repo TEXT         Maven repository to install the Maven package from. If omitted, both Maven Repository and Spark Packages are searched.
      --maven-exclusion TEXT    List of dependencies to exclude. For example: --maven-exclusion "slf4j:slf4j" --maven-exclusion "*:hadoop-client".
      --pypi-package TEXT       The name of the PyPI package to install. An optional exact version specification is also supported. Examples: "simplejson" and "simplejson==3.8.0".
      --pypi-repo TEXT          The repository where the package can be found. If not specified, the default pip index is used.
      --cran-package TEXT       The name of the CRAN package to install.
      --cran-repo TEXT          The repository where the package can be found. If not specified, the default CRAN repo is used.
  list                  Shortcut to `all-cluster-statuses` or `cluster-status`.
    Options:
      --cluster-id CLUSTER_ID   Can be found in the URL at https://<databricks-instance>/?o=<16-digit-number>#/setting/clusters/$CLUSTER_ID/configuration.
  uninstall             Uninstall a library on a cluster.
    Options:
      --cluster-id CLUSTER_ID   Can be found in the URL at https://<databricks-instance>/?o=<16-digit-number>#/setting/clusters/$CLUSTER_ID/configuration. [required]
      --all                     Uninstall all libraries.
      --jar TEXT                JAR on DBFS or S3 or WASB.
      --egg TEXT                Egg on DBFS or S3 or WASB.
      --whl TEXT                Wheel or zipped wheelhouse on DBFS or S3 or WASB. Supported in CLI 0.8.2 and above.
                                Recommended for clusters running Databricks Runtime 4.2 or above.
      --maven-coordinates TEXT  Maven coordinates in the form of GroupId:ArtifactId:Version (for example, org.jsoup:jsoup:1.7.2).
      --maven-repo TEXT         Maven repository to install the Maven package from. If omitted, both Maven Repository and Spark Packages are searched.
      --maven-exclusion TEXT    List of dependencies to exclude. For example: --maven-exclusion "slf4j:slf4j" --maven-exclusion "*:hadoop-client".
      --pypi-package TEXT       The name of the PyPI package to install. An optional exact version specification is also supported. Examples: "simplejson" and "simplejson==3.8.0".
      --pypi-repo TEXT          The repository where the package can be found. If not specified, the default pip index is used.
      --cran-package TEXT       The name of the CRAN package to install.
      --cran-repo TEXT          The repository where the package can be found. If not specified, the default CRAN repo is used.

Install a JAR from DBFS

databricks libraries install --cluster-id $CLUSTER_ID --jar dbfs:/test-dir/test.jar
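
Install a specific version of a PyPI package

For example, using the --pypi-package option shown above (the package name here is the illustrative one from the help text; substitute your own package and cluster ID):

databricks libraries install --cluster-id $CLUSTER_ID --pypi-package "simplejson==3.8.0"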

List library statuses for a cluster

databricks libraries list --cluster-id $CLUSTER_ID
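
Uninstall all libraries from a cluster

For example, using the --all flag shown above (uninstalled libraries are generally not removed until the cluster is restarted):

databricks libraries uninstall --cluster-id $CLUSTER_ID --all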