Deploy and serve models

Deploy models to production with MLflow

For information about deploying models with MLflow, see Log, load, and deploy MLflow Models. The following notebook illustrates how to use the MLflow Model Registry to build, manage, and deploy a model.

MLflow Model Registry example
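A common Model Registry workflow is to find the newest registered version of a model and transition it to a stage. The sketch below outlines this using the MLflow client API; it assumes `mlflow` is installed and a tracking URI is configured, and the model name passed in is hypothetical.

```python
def promote_latest_version(model_name: str, stage: str = "Production") -> str:
    """Sketch: transition the newest registered version of `model_name`
    to the given stage. Assumes mlflow is installed and the tracking URI
    points at a workspace with a Model Registry."""
    # Imported lazily so the sketch can be read without mlflow installed.
    from mlflow.tracking import MlflowClient

    client = MlflowClient()
    # Pick the highest version number among all registered versions.
    latest = max(
        client.search_model_versions(f"name='{model_name}'"),
        key=lambda mv: int(mv.version),
    )
    client.transition_model_version_stage(
        name=model_name, version=latest.version, stage=stage
    )
    return latest.version
```

In a notebook you would typically call this after logging a run with `registered_model_name`, so the newest version is the one you just trained.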

Serve models with MLflow

Databricks provides MLflow Model Serving, which hosts machine learning models from the Model Registry as REST endpoints. These endpoints are updated automatically as model versions become available and change stages. MLflow Model Serving is available for Python MLflow models.
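To score against a served model you POST a JSON body to the endpoint's invocations URL. The helper below builds a request body in the `dataframe_split` orientation, one of the input formats the MLflow scoring server accepts; the endpoint URL shown in the comment is a placeholder pattern, so substitute your own workspace hostname and model name.

```python
import json


def build_scoring_request(columns, rows) -> str:
    """Build a JSON request body in MLflow's 'dataframe_split' orientation:
    a list of column names plus a row-major list of data rows."""
    return json.dumps(
        {
            "dataframe_split": {
                "columns": list(columns),
                "data": [list(row) for row in rows],
            }
        }
    )


# Example body for a model with two numeric features (hypothetical names):
body = build_scoring_request(["feature_a", "feature_b"], [[1.0, 2.0], [3.0, 4.0]])

# To send it, POST `body` with Content-Type: application/json and a
# Databricks token to the endpoint's invocations URL, e.g. (placeholder):
#   https://<databricks-instance>/model/<model-name>/<stage>/invocations
```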

Run a Databricks job

You can create a Databricks job to run a notebook or JAR either immediately or on a schedule.
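A job can be created programmatically by POSTing a request body to the Jobs API. The sketch below assembles such a body for a single notebook task with an optional cron schedule; the field names follow the public Jobs API 2.1 shape, but verify them against the API version your workspace exposes, and the notebook path and job name here are placeholders.

```python
def notebook_job_payload(job_name, notebook_path, cron=None, timezone="UTC"):
    """Sketch of a Jobs API `jobs/create` request body for one notebook
    task. `cron`, if given, is a Quartz cron expression for the schedule."""
    job = {
        "name": job_name,
        "tasks": [
            {
                "task_key": "main",
                "notebook_task": {"notebook_path": notebook_path},
            }
        ],
    }
    if cron:
        job["schedule"] = {
            "quartz_cron_expression": cron,
            "timezone_id": timezone,
        }
    return job


# Run a (hypothetical) notebook every day at 02:00 UTC:
payload = notebook_job_payload(
    "nightly-scoring", "/Repos/team/score_model", cron="0 0 2 * * ?"
)
```

Without the `schedule` block, the same payload defines a job you can trigger on demand (a run-now job).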