Configure telemetry for Databricks Apps

Beta

App telemetry is in Beta.

Databricks Apps telemetry collects traces, logs, and metrics and persists them to Unity Catalog tables using the OpenTelemetry (OTel) protocol. After you enable app telemetry, Databricks automatically captures system logs and usage events such as user login and direct API requests. You can also add custom instrumentation using the OpenTelemetry SDK for your framework.

Requirements

  • Your workspace must be in a supported region: ap-northeast-1, ap-northeast-2, ap-south-1, ap-southeast-1, ap-southeast-2, ca-central-1, eu-central-1, eu-west-1, eu-west-2, sa-east-1, us-east-1, us-east-2, us-west-2.
  • To create new telemetry target tables in Unity Catalog, you need CAN MANAGE permissions on the target catalog and schema, and CREATE TABLE on the schema.
  • To write to existing telemetry target tables in Unity Catalog, you need either CAN MANAGE permissions on the target catalog and schema, or all account users must have USE CATALOG, USE SCHEMA, SELECT, and MODIFY on the target tables.
  • Target tables must be managed Delta tables in the same region as your workspace.
  • Databricks recommends enabling predictive optimization on the telemetry target tables for better query performance.

Enable app telemetry

note

If you created an app before the app telemetry beta, you must stop and restart it before you proceed with the following configuration steps.

To turn on telemetry for an app, configure a catalog and schema for the telemetry tables in the app settings.

  1. Open the app details page in your Databricks workspace.
  2. On the overview tab, locate the App telemetry configuration section and click Add.
  3. Enter or browse to select a catalog and schema. Databricks writes telemetry data to three tables in the selected location: otel_metrics, otel_spans, and otel_logs.
  4. (Optional) Specify a table prefix so that tables are named <prefix>_otel_metrics, <prefix>_otel_spans, and <prefix>_otel_logs. Databricks appends to existing tables or creates them if they don't exist.
  5. Click Save.
  6. Redeploy the app so that telemetry starts flowing to Unity Catalog.

Verify telemetry data

The otel_logs table is populated automatically after redeployment. The otel_spans and otel_metrics tables are only populated after you add custom instrumentation to your app.

After you redeploy the app:

  1. Visit the app URL to generate activity.

  2. Wait a few seconds for the initial batch of data to appear.

  3. Run the following query in Databricks SQL to confirm data is flowing:

    SQL
    SELECT * FROM <catalog>.<schema>.otel_logs
    LIMIT 10;

Query telemetry data

Useful columns for filtering and correlating telemetry data include time, service_name, trace_id, span_id, and attributes. The attributes column is a map that contains event-specific metadata such as event.name.
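The correlation columns above are also useful client-side. As a hypothetical sketch, the helper below groups already-fetched log rows by `trace_id` so all records from one request can be inspected together (the row shape, dicts with a `trace_id` key, is an assumption about how you fetch results):

```python
from collections import defaultdict

def group_by_trace(rows):
    """Group fetched log rows (dicts with a 'trace_id' key) by trace.

    Rows sharing a trace_id belong to the same request, so grouping them
    reconstructs per-request activity from the otel_logs table.
    """
    traces = defaultdict(list)
    for row in rows:
        traces[row["trace_id"]].append(row)
    return dict(traces)
```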

To view the full schema of any telemetry table, run:

SQL
DESCRIBE TABLE <catalog>.<schema>.otel_logs;

The following example queries for error-level logs from the last hour, which is useful for debugging app issues:

SQL
SELECT time, body
FROM <catalog>.<schema>.otel_logs
WHERE service_name = '<app-name>'
AND severity_text = 'ERROR'
AND time >= current_timestamp() - INTERVAL 1 HOUR
ORDER BY time DESC
LIMIT 100;

Query system events

Databricks automatically captures system events in the otel_logs table, such as usage events for user logins and direct API requests. Query these events by filtering on the event.name attribute.

The following example retrieves the 100 most recent user login events (event name app.auth) for an application:

SQL
SELECT time, attributes
FROM <catalog>.<schema>.otel_logs
WHERE service_name = '<app-name>'
AND attributes:["event.name"]::string = 'app.auth'
ORDER BY time DESC
LIMIT 100;

Add custom instrumentation

Add OpenTelemetry auto-instrumentation to generate custom traces, metrics, and logs. Update your app.yaml and dependency files as shown for your framework.

Update app.yaml:

YAML
command: ['opentelemetry-instrument', 'streamlit', 'run', 'app.py']
env:
  - name: OTEL_TRACES_SAMPLER
    value: 'always_on'

Update requirements.txt:

streamlit==1.38.0

# Auto-instrumentation
opentelemetry-distro
opentelemetry-exporter-otlp-proto-grpc

# Required for Streamlit
opentelemetry-instrumentation-tornado

# Host metrics (CPU, memory)
opentelemetry-instrumentation-system-metrics

Environment variables

When you enable app telemetry, Databricks automatically configures environment variables in your app runtime for the OTLP collector endpoint, export protocol, resource attributes, and batch processing. For the full list of OTel environment variables, see App telemetry environment variables.
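Application code can read these standard OTel variables at runtime, for example to log the active telemetry configuration. The variable names below are the standard OpenTelemetry ones; the fallback values are illustrative defaults for local runs, not what Databricks configures:

```python
import os

def otel_settings() -> dict:
    """Collect standard OTel exporter settings from the environment.

    Databricks sets these variables when app telemetry is enabled; the
    defaults here are only fallbacks for running outside the platform.
    """
    return {
        "endpoint": os.environ.get(
            "OTEL_EXPORTER_OTLP_ENDPOINT", "http://localhost:4317"
        ),
        "protocol": os.environ.get("OTEL_EXPORTER_OTLP_PROTOCOL", "grpc"),
        "resource_attributes": os.environ.get("OTEL_RESOURCE_ATTRIBUTES", ""),
    }
```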

Limits and limitations

App telemetry uses the Zerobus Ingest connector to write data to Unity Catalog tables. All Zerobus Ingest connector limitations apply to app telemetry, including limits on record size, throughput, delivery guarantees, and target table requirements.

In addition to the Zerobus limits, app telemetry enforces a maximum size of 1 MB per log line.
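If your app can emit very large log lines, one way to stay under the per-line limit is to truncate before logging. The helper below is an illustrative guard, not part of any Databricks SDK; it truncates on UTF-8 bytes, since the limit is a byte size:

```python
MAX_LOG_LINE_BYTES = 1024 * 1024  # the 1 MB per-line limit described above

def truncate_log_line(line: str, limit: int = MAX_LOG_LINE_BYTES) -> str:
    """Truncate a log line to the byte limit before emitting it.

    Encodes to UTF-8, drops bytes past the limit, and ignores any
    multi-byte character split at the cut point when decoding back.
    """
    encoded = line.encode("utf-8")
    if len(encoded) <= limit:
        return line
    return encoded[:limit].decode("utf-8", errors="ignore")
```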