For models registered in Model Registry, you can automatically generate a notebook for batch inference or configure the model for online serving.
For scalable model inference with MLlib and XGBoost4J models, use the native transform methods to perform inference directly on Spark DataFrames. The MLlib example notebooks include several notebooks with inference steps.
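As a minimal sketch (the model object, table name, and load path below are illustrative placeholders, not part of any specific API), batch inference with an MLlib model comes down to calling `transform` on a Spark DataFrame:

```python
# Sketch: batch inference with an MLlib model's native transform method.
# The model, path, and table names here are illustrative placeholders.

def run_batch_inference(model, features_df):
    """Score a Spark DataFrame with a fitted MLlib model.

    MLlib models (and XGBoost4J's Spark wrappers) implement .transform,
    which returns the input DataFrame with prediction columns appended.
    """
    return model.transform(features_df)

# On Databricks this is typically used along these lines:
# from pyspark.ml.pipeline import PipelineModel
# model = PipelineModel.load("/path/to/model")          # illustrative path
# predictions = run_batch_inference(model, spark.table("features"))
```

Because `transform` is lazy and runs as a regular Spark job, the inference scales with the cluster without any extra parallelization code.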
For other libraries and model types, create a Spark UDF to scale out inference on large datasets. For smaller datasets, use the native model inference routines provided by the library.
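The Spark UDF approach can be sketched as follows. The toy model, function names, and column names are assumptions for illustration; in practice you would load a real library model (for example via MLflow) on each worker:

```python
# Sketch: scale out single-node model inference with a pandas UDF.
# ThresholdModel is a toy stand-in for a library model with a
# predict() method; replace it with your actual model.
import pandas as pd


class ThresholdModel:
    """Toy stand-in for a single-node model with a predict() method."""

    def predict(self, values):
        return [1.0 if v > 0.5 else 0.0 for v in values]


model = ThresholdModel()


def predict_batch(batch: pd.Series) -> pd.Series:
    # Called once per Arrow batch on each worker; the model scores
    # the whole batch at once rather than row by row.
    return pd.Series(model.predict(batch.tolist()))


# On a Spark cluster, wrap the batch function as a pandas UDF:
# from pyspark.sql.functions import pandas_udf
# predict_udf = pandas_udf(predict_batch, returnType="double")
# scored = df.withColumn("prediction", predict_udf("feature"))
```

Scoring whole batches inside the UDF amortizes model invocation overhead across many rows, which is why a pandas UDF is generally preferred over a plain per-row UDF for inference.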
The following articles provide an introduction to deep learning model inference on Databricks.