Python ML model training with Unity Catalog data

Unity Catalog allows you to apply fine-grained security to tables and to securely access them from any language, all while interacting seamlessly with other machine-learning components in Databricks. This article shows how to use Python to train a machine-learning model using data in Unity Catalog.

Requirements

  • Your Databricks account must be on the Premium plan.

  • You must have permission to create a cluster, or access to an existing cluster running in a Unity Catalog-compliant access mode.

Create a Databricks Machine Learning cluster

Follow these steps to create a Single-User Databricks Machine Learning cluster that can access data in Unity Catalog.

  1. In the sidebar, click Compute.

  2. Click Create Cluster.

    1. Click ML.

    2. Select either 11.1 ML (Scala 2.12.14, Spark 3.3.0) or higher, or 11.1 ML (GPU, Scala 2.12.14, Spark 3.3.0) or higher.

  3. Click Access Mode and select Single User or Shared, depending on your use case.

    Shared clusters can be shared by multiple users, but only SQL and Python workloads are supported.

    To run workloads using Python, Scala, or R, set the access mode to single user. Single user clusters can also run SQL workloads. The cluster can be used exclusively by a single user (by default, the single user is the owner of the cluster) and other users can’t attach to the cluster.

    For more information about the features available in each access mode, see What is cluster access mode?.

  4. Click Create Cluster.
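The UI steps above correspond to a cluster definition that could also be submitted through the Databricks Clusters API. A minimal sketch of such a payload as a Python dict (the cluster name and node type are placeholder values; `data_security_mode` is the API field behind the Access Mode setting):

```python
# Sketch of a Single User ML cluster spec for the Clusters API.
# cluster_name and node_type_id are placeholders, not values from this article.
cluster_spec = {
    "cluster_name": "ml-single-user",
    "spark_version": "11.1.x-cpu-ml-scala2.12",  # 11.1 ML runtime
    "node_type_id": "i3.xlarge",
    "num_workers": 1,
    # SINGLE_USER enables Unity Catalog access for one designated user
    "data_security_mode": "SINGLE_USER",
}

print(cluster_spec["data_security_mode"])  # SINGLE_USER
```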

Create the catalog

Follow these steps to create a new catalog where your machine learning team can store their data assets.

  1. In a workspace with the metastore assigned, log in as the metastore admin, or as a user with the CREATE CATALOG privilege.

  2. Create a notebook or open the Databricks SQL editor.

  3. Run the following command to create the ml catalog:

    CREATE CATALOG ml;
    

    When you create a catalog, a schema named default is automatically created within it.

  4. Grant the ml_team group access to the ml catalog and the ml.default schema, along with the ability to create tables and views in the schema. To include all account-level users instead, use the group account users.

    GRANT USAGE ON CATALOG ml TO `ml_team`;
    GRANT USAGE, CREATE ON SCHEMA ml.default TO `ml_team`;
    

Now, any user in the ml_team group can run the following example notebook.
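With these grants in place, members of ml_team address tables using Unity Catalog's three-level namespace, catalog.schema.table. A minimal sketch of building a qualified name (the table name `feature_table` is hypothetical, and the Spark calls in the comments assume a notebook attached to a Unity Catalog-enabled cluster):

```python
def qualified_name(catalog: str, schema: str, table: str) -> str:
    # Unity Catalog addresses tables with a three-level namespace
    return f"{catalog}.{schema}.{table}"

name = qualified_name("ml", "default", "feature_table")
print(name)  # ml.default.feature_table

# On an attached cluster, a member of ml_team could then read or write it:
#   df = spark.table(name)
#   df.write.saveAsTable(qualified_name("ml", "default", "results"))
```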

Import the example notebook

To get started, import the following notebook.

Machine learning with Unity Catalog


To import the notebook:

  1. Next to the notebook, click Copy link for import.

  2. In your workspace, click Workspace in the sidebar.

  3. Next to a folder, click the down caret, then click Import.

  4. Click URL, then paste in the link you copied.

  5. The imported notebook appears in the folder you selected. Double-click the notebook name to open it.

  6. At the top of the notebook, select your Databricks Machine Learning cluster to attach the notebook to it.

The notebook is divided into several high-level sections:

  1. Setup.

  2. Read data from CSV files and write it to Unity Catalog.

  3. Load the data into pandas DataFrames and clean it up.

  4. Train a basic classification model.

  5. Tune hyperparameters and optimize the model.

  6. Write the results to a new table and share it with other users.

To run a cell, click Run. To run the entire notebook, click Run All.