OpenTelemetry Export
Traces generated by MLflow conform to the OpenTelemetry trace specification, so they can be exported to any observability solution that supports OpenTelemetry.
Export modes
MLflow supports three export modes for traces:
- MLflow tracking only (Default): Traces are sent only to the MLflow Tracking Server.
- OpenTelemetry only: Traces are sent only to an OpenTelemetry Collector.
- Dual export: Traces are sent to both MLflow Tracking and an OpenTelemetry Collector.
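As a quick reference, the three modes are selected through environment variables, which are covered in detail in the sections below. This sketch only illustrates which variable drives each mode; the endpoint value is illustrative:

```python
import os

# MLflow tracking only (default): no extra configuration is needed.

# OpenTelemetry only: point MLflow at an OTLP endpoint before any trace starts.
os.environ["OTEL_EXPORTER_OTLP_TRACES_ENDPOINT"] = "http://localhost:4317"

# Dual export: keep the OTLP endpoint and additionally enable dual export,
# so traces also reach the MLflow Tracking Server.
os.environ["MLFLOW_ENABLE_DUAL_EXPORT"] = "true"
```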
OpenTelemetry-only export
By default, MLflow exports traces to the MLflow Tracking Server. To export traces only to an OpenTelemetry Collector instead, set the OTEL_EXPORTER_OTLP_ENDPOINT environment variable (or OTEL_EXPORTER_OTLP_TRACES_ENDPOINT) to the target URL of the OpenTelemetry Collector before starting any trace.
```python
import mlflow
import os

# Set the endpoint of the OpenTelemetry Collector (gRPC, the default protocol)
os.environ["OTEL_EXPORTER_OTLP_TRACES_ENDPOINT"] = "http://localhost:4317"

# Optionally, set the service name to group traces
os.environ["OTEL_SERVICE_NAME"] = "<your-service-name>"

# The trace will be exported ONLY to the OTel collector at http://localhost:4317
with mlflow.start_span(name="foo") as span:
    span.set_inputs({"a": 1})
    span.set_outputs({"b": 2})
```
Dual export (MLflow + OpenTelemetry)
To export traces to both Databricks MLflow and another OpenTelemetry service simultaneously, use MLflow's dual export configuration. This enables sending the same trace data to multiple destinations without having to choose between MLflow's tracking capabilities and your existing observability infrastructure.
Enable dual export
Set the MLFLOW_ENABLE_DUAL_EXPORT environment variable to "true" along with your OpenTelemetry configuration:
```python
import mlflow
import os

# Enable dual export mode
os.environ["MLFLOW_ENABLE_DUAL_EXPORT"] = "true"

# Configure the OpenTelemetry Collector endpoint (gRPC, the default protocol)
os.environ["OTEL_EXPORTER_OTLP_TRACES_ENDPOINT"] = "http://localhost:4317"
os.environ["OTEL_SERVICE_NAME"] = "my-ml-service"

# Configure the MLflow tracking URI to point to Databricks
mlflow.set_tracking_uri("databricks")

# Traces will be exported to BOTH MLflow and the OTel collector
with mlflow.start_span(name="dual_export_example") as span:
    span.set_inputs({"model": "gpt-4", "prompt": "Hello world"})
    # Your ML workflow here
    result = "Generated response"
    span.set_outputs({"response": result})
    span.set_attributes({"token_count": 15})
```
Metrics export
MLflow can export OpenTelemetry metrics when a metrics endpoint is configured. This allows you to monitor span durations and other trace-related metrics in compatible monitoring systems.
For a complete list of metrics that MLflow exports, see MLflow's exported metrics documentation.
Enable metrics export
To export metrics to an OpenTelemetry Collector, set the following environment variables:
```python
import os

# Enable metrics export
os.environ["OTEL_METRICS_EXPORTER"] = "otlp"
os.environ["OTEL_EXPORTER_OTLP_METRICS_ENDPOINT"] = "http://localhost:4317"

# Optional: configure the metric export interval (in milliseconds)
os.environ["OTEL_METRIC_EXPORT_INTERVAL"] = "60000"  # Export every 60 seconds
```
OpenTelemetry Collectors
Refer to the OpenTelemetry documentation for your observability platform to learn how to set up a Collector:
- Datadog
- New Relic
- SigNoz
- Splunk
- Grafana
- ServiceNow (Lightstep)
Configurations
MLflow uses the standard OTLP Exporter for exporting traces to OpenTelemetry Collector instances. As a result, you can use all of the configurations supported by OpenTelemetry. The following example configures the OTLP Exporter to use the HTTP protocol instead of the default gRPC and sets custom headers:
```shell
export OTEL_EXPORTER_OTLP_TRACES_ENDPOINT="http://localhost:4317/v1/traces"
export OTEL_EXPORTER_OTLP_TRACES_PROTOCOL="http/protobuf"
export OTEL_EXPORTER_OTLP_TRACES_HEADERS="api_key=12345"
```
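For notebook-based workflows, the same exporter settings can also be applied from Python before any trace is started. The values below mirror the shell example; the endpoint, protocol, and header values are illustrative:

```python
import os

# Use the HTTP/protobuf transport instead of the default gRPC
os.environ["OTEL_EXPORTER_OTLP_TRACES_ENDPOINT"] = "http://localhost:4317/v1/traces"
os.environ["OTEL_EXPORTER_OTLP_TRACES_PROTOCOL"] = "http/protobuf"

# Attach custom headers (e.g. an API key) to every export request
os.environ["OTEL_EXPORTER_OTLP_TRACES_HEADERS"] = "api_key=12345"
```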
Next Steps
- Understand tracing concepts - Learn how MLflow's OpenTelemetry-compatible traces are structured
- Instrument your app with tracing - Add custom spans to enrich your OpenTelemetry exports
- Debug production issues - Use exported traces to monitor production applications