MLflow Tracing Integrations
MLflow Tracing is integrated with a wide variety of popular generative AI frameworks and libraries, offering a one-line automatic-tracing experience for all of them. This gives you immediate observability into your GenAI applications with minimal setup.
Automatic tracing captures your application's logic and intermediate steps, such as LLM calls, tool usage, and agent interactions, based on your implementation of the specific library or SDK.
To learn more about how automatic tracing works, its prerequisites, and examples of combining it with manual tracing, see the main guide on automatic tracing. The quick examples below highlight some of the key integrations. Detailed guides for each supported library, covering prerequisites, advanced examples, and library-specific behavior, are available on their respective pages in this section.
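To give a flavor of how the two modes compose, here is a minimal sketch, assuming OPENAI_API_KEY is set (the function name and model choice are illustrative): with auto-tracing enabled, wrapping the workflow in a manual span via the @mlflow.trace decorator makes the auto-traced LLM call appear as a child span of the manual one.
import mlflow
import openai

# Auto-trace every OpenAI call
mlflow.openai.autolog()

@mlflow.trace  # manual parent span for the whole workflow
def answer_question(question: str) -> str:  # illustrative helper, not from the docs
    client = openai.OpenAI()
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # illustrative model choice
        messages=[{"role": "user", "content": question}],
    )
    return response.choices[0].message.content

answer_question("What is MLflow Tracing?")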
Overview of key integrations
Here are quick examples of some of the most commonly used integrations; a basic usage example for each is shown below. For detailed prerequisites and more advanced scenarios, see each integration's dedicated page (linked from the list below).
- OpenAI
- LangChain
- LangGraph
- Anthropic
- DSPy
- Databricks
- Bedrock
- AutoGen
# --- OpenAI ---
import mlflow
import openai
# If running this code outside of a Databricks notebook (e.g., locally),
# uncomment and set the following environment variables to point to your Databricks workspace:
# import os
# os.environ["DATABRICKS_HOST"] = "https://your-workspace.cloud.databricks.com"
# os.environ["DATABRICKS_TOKEN"] = "your-personal-access-token"
# Enable auto-tracing for OpenAI
mlflow.openai.autolog()
# Set up MLflow tracking
mlflow.set_tracking_uri("databricks")
mlflow.set_experiment("/Shared/openai-tracing-demo")
openai_client = openai.OpenAI()
messages = [
{
"role": "user",
"content": "What is the capital of France?",
}
]
response = openai_client.chat.completions.create(
model="gpt-4o-mini",
messages=messages,
temperature=0.1,
max_tokens=100,
)
# View trace in MLflow UI
# --- LangChain ---
import mlflow
from langchain_core.prompts import PromptTemplate
from langchain_core.output_parsers import StrOutputParser
from langchain_openai import ChatOpenAI
# If running this code outside of a Databricks notebook (e.g., locally),
# uncomment and set the following environment variables to point to your Databricks workspace:
# import os
# os.environ["DATABRICKS_HOST"] = "https://your-workspace.cloud.databricks.com"
# os.environ["DATABRICKS_TOKEN"] = "your-personal-access-token"
mlflow.langchain.autolog()
mlflow.set_tracking_uri("databricks")
mlflow.set_experiment("/Shared/langchain-tracing-demo")
llm = ChatOpenAI(model="gpt-4o-mini", temperature=0.7, max_tokens=1000)
prompt = PromptTemplate.from_template("Tell me a joke about {topic}.")
chain = prompt | llm | StrOutputParser()
chain.invoke({"topic": "artificial intelligence"})
# View trace in MLflow UI
# --- LangGraph ---
import mlflow
from langchain_core.tools import tool
from langchain_openai import ChatOpenAI
from langgraph.prebuilt import create_react_agent
# If running this code outside of a Databricks notebook (e.g., locally),
# uncomment and set the following environment variables to point to your Databricks workspace:
# import os
# os.environ["DATABRICKS_HOST"] = "https://your-workspace.cloud.databricks.com"
# os.environ["DATABRICKS_TOKEN"] = "your-personal-access-token"
mlflow.langchain.autolog() # LangGraph uses LangChain's autolog
mlflow.set_tracking_uri("databricks")
mlflow.set_experiment("/Shared/langgraph-tracing-demo")
@tool
def get_weather(city: str):
"""Use this to get weather information."""
return f"It might be cloudy in {city}"
llm = ChatOpenAI(model="gpt-4o-mini")
graph = create_react_agent(llm, [get_weather])
result = graph.invoke({"messages": [("user", "what is the weather in sf?")]})
# View trace in MLflow UI
# --- Anthropic ---
import mlflow
import anthropic
import os
# If running this code outside of a Databricks notebook (e.g., locally),
# uncomment and set the following environment variables to point to your Databricks workspace:
# os.environ["DATABRICKS_HOST"] = "https://your-workspace.cloud.databricks.com"
# os.environ["DATABRICKS_TOKEN"] = "your-personal-access-token"
mlflow.anthropic.autolog()
mlflow.set_tracking_uri("databricks")
mlflow.set_experiment("/Shared/anthropic-tracing-demo")
client = anthropic.Anthropic(api_key=os.environ.get("ANTHROPIC_API_KEY"))
message = client.messages.create(
model="claude-3-5-sonnet-20241022",
max_tokens=1024,
messages=[{"role": "user", "content": "Hello, Claude"}],
)
# View trace in MLflow UI
# --- DSPy ---
import mlflow
import dspy
# If running this code outside of a Databricks notebook (e.g., locally),
# uncomment and set the following environment variables to point to your Databricks workspace:
# import os
# os.environ["DATABRICKS_HOST"] = "https://your-workspace.cloud.databricks.com"
# os.environ["DATABRICKS_TOKEN"] = "your-personal-access-token"
mlflow.dspy.autolog()
mlflow.set_tracking_uri("databricks")
mlflow.set_experiment("/Shared/dspy-tracing-demo")
lm = dspy.LM("openai/gpt-4o-mini") # Assumes OPENAI_API_KEY is set
dspy.configure(lm=lm)
class SimpleSignature(dspy.Signature):
input_text: str = dspy.InputField()
output_text: str = dspy.OutputField()
program = dspy.Predict(SimpleSignature)
result = program(input_text="Summarize MLflow Tracing.")
# View trace in MLflow UI
# --- Databricks ---
import mlflow
import os
from openai import OpenAI # Databricks FMAPI uses OpenAI client
# If running this code outside of a Databricks notebook (e.g., locally),
# uncomment and set the following environment variables to point to your Databricks workspace:
# os.environ["DATABRICKS_HOST"] = "https://your-workspace.cloud.databricks.com"
# os.environ["DATABRICKS_TOKEN"] = "your-personal-access-token"
mlflow.openai.autolog() # Traces Databricks FMAPI via OpenAI client
mlflow.set_tracking_uri("databricks")
mlflow.set_experiment("/Shared/databricks-fmapi-tracing")
client = OpenAI(
api_key=os.environ.get("DATABRICKS_TOKEN"),
base_url=f"{os.environ.get('DATABRICKS_HOST')}/serving-endpoints"
)
response = client.chat.completions.create(
model="databricks-llama-4-maverick",
messages=[{"role": "user", "content": "Key features of MLflow?"}],
)
# View trace in MLflow UI
# --- Bedrock ---
import mlflow
import boto3
# If running this code outside of a Databricks notebook (e.g., locally),
# uncomment and set the following environment variables to point to your Databricks workspace:
# import os
# os.environ["DATABRICKS_HOST"] = "https://your-workspace.cloud.databricks.com"
# os.environ["DATABRICKS_TOKEN"] = "your-personal-access-token"
mlflow.bedrock.autolog()
mlflow.set_tracking_uri("databricks")
mlflow.set_experiment("/Shared/bedrock-tracing-demo")
bedrock = boto3.client(
service_name="bedrock-runtime",
region_name="us-east-1" # Replace with your region
)
response = bedrock.converse(
modelId="anthropic.claude-3-5-sonnet-20241022-v2:0",
messages=[{"role": "user", "content": "Hello World in one line."}]
)
# View trace in MLflow UI
# --- AutoGen ---
import mlflow
from autogen import ConversableAgent
import os
# If running this code outside of a Databricks notebook (e.g., locally),
# uncomment and set the following environment variables to point to your Databricks workspace:
# os.environ["DATABRICKS_HOST"] = "https://your-workspace.cloud.databricks.com"
# os.environ["DATABRICKS_TOKEN"] = "your-personal-access-token"
mlflow.autogen.autolog()
mlflow.set_tracking_uri("databricks")
mlflow.set_experiment("/Shared/autogen-tracing-demo")
config_list = [{"model": "gpt-4o-mini", "api_key": os.environ.get("OPENAI_API_KEY")}]
assistant = ConversableAgent("assistant", llm_config={"config_list": config_list})
user_proxy = ConversableAgent("user_proxy", human_input_mode="NEVER", code_execution_config=False)
user_proxy.initiate_chat(assistant, message="What is 2+2?")
# View trace in MLflow UI
Enabling multiple auto-tracing integrations
Because GenAI applications often combine multiple libraries, MLflow Tracing lets you enable automatic tracing for several integrations simultaneously, providing a unified tracing experience.
For example, to enable tracing for both LangChain and direct OpenAI usage:
import mlflow
# Enable MLflow Tracing for both LangChain and OpenAI
mlflow.langchain.autolog()
mlflow.openai.autolog()
# Your code using both LangChain and OpenAI directly...
# ... an example can be found on the Automatic Tracing page ...
MLflow generates a single, cohesive trace that combines the steps from the LangChain and direct OpenAI LLM calls, letting you inspect the complete flow. More examples of combining integrations can be found on the Automatic Tracing page.
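For a flavor of what such combined code might look like, here is a minimal sketch, assuming OPENAI_API_KEY is set (the prompts, model, and wrapper function are illustrative); the @mlflow.trace parent span groups the LangChain chain and the direct OpenAI call into a single trace:
import mlflow
import openai
from langchain_core.prompts import PromptTemplate
from langchain_openai import ChatOpenAI

mlflow.langchain.autolog()
mlflow.openai.autolog()

@mlflow.trace  # parent span so both library calls land in one trace
def draft_and_polish(topic: str) -> str:  # illustrative helper, not from the docs
    # Step 1: draft with a LangChain chain (traced by the LangChain integration)
    chain = PromptTemplate.from_template("Write one sentence about {topic}.") | ChatOpenAI(model="gpt-4o-mini")
    draft = chain.invoke({"topic": topic}).content
    # Step 2: refine with a direct OpenAI call (traced by the OpenAI integration)
    response = openai.OpenAI().chat.completions.create(
        model="gpt-4o-mini",
        messages=[{"role": "user", "content": f"Polish this sentence: {draft}"}],
    )
    return response.choices[0].message.content

draft_and_polish("MLflow Tracing")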
Disabling automatic tracing
Automatic tracing for any specific library can be disabled by calling mlflow.<library>.autolog(disable=True).
To disable all autologging integrations at once, use mlflow.autolog(disable=True).
import mlflow
# Disable for a specific library
mlflow.openai.autolog(disable=True)
# Disable all autologging
mlflow.autolog(disable=True)