For information about deploying models with MLflow, see Log, load, and deploy MLflow Models. The following notebook illustrates how to use MLflow Model Registry to build, manage, and deploy a model.
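As a minimal sketch of working with the Model Registry programmatically, the snippet below builds request bodies for two documented MLflow REST API endpoints (`/api/2.0/mlflow/registered-models/create` and `/api/2.0/mlflow/model-versions/transition-stage`). The model name, version, and stage values are placeholders for illustration only.

```python
import json

# Sketch: request bodies for the MLflow Model Registry REST API.
# POST /api/2.0/mlflow/registered-models/create registers a model name;
# POST /api/2.0/mlflow/model-versions/transition-stage moves a version
# between stages such as Staging and Production.

def create_registered_model_payload(name: str) -> dict:
    """Body for registering a new model name in the registry."""
    return {"name": name}

def transition_stage_payload(name: str, version: str, stage: str) -> dict:
    """Body for transitioning a model version to a new stage."""
    return {"name": name, "version": version, "stage": stage}

# "churn-model" is a hypothetical model name used for illustration.
payload = transition_stage_payload("churn-model", "3", "Production")
print(json.dumps(payload))
```

The same operations are also available through the MLflow Python client; the raw payloads are shown here to make the registry's request shapes explicit.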
Databricks provides MLflow Model Serving, which lets you host machine learning models from the Model Registry as REST endpoints. These endpoints are updated automatically as model versions are registered and transitioned between stages. MLflow Model Serving is available for Python MLflow models.
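A served model is scored by sending a POST request to a URL of the form `https://<workspace>/model/<model-name>/<stage-or-version>/invocations`. The sketch below prepares such a request using only the standard library; the workspace host, model name, token, and input columns are placeholders, and the exact JSON payload shape accepted by the endpoint depends on the MLflow version in use (a pandas split-orientation body is used here as one common form).

```python
import json
import urllib.request

WORKSPACE = "https://example.cloud.databricks.com"  # placeholder host
MODEL_NAME = "churn-model"                          # hypothetical model
STAGE = "Production"

def build_scoring_request(columns, rows, token="<personal-access-token>"):
    """Prepare (but do not send) a scoring request for the serving endpoint.

    The body uses pandas "split" orientation ({"columns": ..., "data": ...});
    check your MLflow version's docs for the payload format it expects.
    """
    url = f"{WORKSPACE}/model/{MODEL_NAME}/{STAGE}/invocations"
    body = json.dumps({"columns": columns, "data": rows}).encode()
    return urllib.request.Request(
        url,
        data=body,
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

req = build_scoring_request(["age", "tenure"], [[42, 5]])
print(req.full_url)
```

Sending the request with `urllib.request.urlopen(req)` (against a real workspace and token) would return the model's predictions as JSON.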
You can create a Databricks job to run a notebook or JAR either immediately or on a scheduled basis.
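A scheduled notebook job can be described as a request body for the Databricks Jobs API (`POST /api/2.0/jobs/create`). The sketch below shows one such body; the job name, cluster specification, notebook path, and cron schedule are all placeholder values, not a prescribed configuration.

```python
import json

# Sketch: a Jobs API 2.0 create request that runs a notebook nightly.
# All identifiers below are placeholders for illustration.
job_spec = {
    "name": "nightly-training",                   # hypothetical job name
    "new_cluster": {
        "spark_version": "13.3.x-scala2.12",      # placeholder runtime version
        "node_type_id": "i3.xlarge",              # placeholder node type
        "num_workers": 2,
    },
    "notebook_task": {
        "notebook_path": "/Users/someone@example.com/train",  # placeholder path
    },
    "schedule": {
        # Quartz cron syntax: run every day at 02:00 UTC.
        "quartz_cron_expression": "0 0 2 * * ?",
        "timezone_id": "UTC",
    },
}

print(json.dumps(job_spec, indent=2))
```

Omitting the `schedule` block and instead calling the run-now endpoint with the created job's ID corresponds to the "run immediately" case.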