View data quality monitoring expenses

To check data quality monitoring expenses, query the system table system.billing.usage. For more information on querying billing records, see Billable usage system table reference.

Anomaly detection expenses

To view only anomaly detection expenses, filter on usage_metadata.schema_id IS NOT NULL. Because anomaly detection is enabled at the schema level, a non-null schema_id identifies costs attributable to anomaly detection.

SQL
SELECT usage_date, sum(usage_quantity) AS dbus
FROM system.billing.usage
WHERE
  usage_date >= DATE_SUB(current_date(), 30) AND
  billing_origin_product = 'DATA_QUALITY_MONITORING' AND
  usage_metadata.schema_id IS NOT NULL
GROUP BY usage_date
ORDER BY usage_date DESC
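If you also want to see which schemas account for the spend, the same filter can be grouped by schema ID instead of date. A sketch, using only the usage_metadata.schema_id field described above:

```sql
-- Anomaly detection DBUs over the last 30 days, broken down by schema
SELECT usage_metadata.schema_id, sum(usage_quantity) AS dbus
FROM system.billing.usage
WHERE
  usage_date >= DATE_SUB(current_date(), 30) AND
  billing_origin_product = 'DATA_QUALITY_MONITORING' AND
  usage_metadata.schema_id IS NOT NULL
GROUP BY usage_metadata.schema_id
ORDER BY dbus DESC
```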

Data profiling expenses

To check expenses, use a query or the billing portal.

View usage from the system table system.billing.usage

You can check data profiling expenses using the system table system.billing.usage. Data profiling is billed under a serverless jobs SKU but does not require your account to be enabled for serverless compute for workflows.

For more information on querying billing records, see Billable usage system table reference.

SQL
SELECT usage_date, sum(usage_quantity) AS dbus
FROM system.billing.usage
WHERE
  usage_date >= DATE_SUB(current_date(), 30) AND
  sku_name LIKE '%JOBS_SERVERLESS%' AND
  custom_tags['LakehouseMonitoring'] = 'true'
GROUP BY usage_date
ORDER BY usage_date DESC
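The query above returns DBU counts. To convert these to an approximate dollar amount, you can join against the pricing system table. A sketch, assuming your account exposes system.billing.list_prices with a pricing.default column; verify the table schema in your account before relying on it:

```sql
-- Approximate cost of data profiling over the last 30 days,
-- matching each usage record to the list price in effect at the time
SELECT u.usage_date,
       sum(u.usage_quantity * lp.pricing.default) AS approx_cost
FROM system.billing.usage u
JOIN system.billing.list_prices lp
  ON u.sku_name = lp.sku_name
  AND u.usage_start_time >= lp.price_start_time
  AND (lp.price_end_time IS NULL OR u.usage_start_time < lp.price_end_time)
WHERE
  u.usage_date >= DATE_SUB(current_date(), 30) AND
  u.sku_name LIKE '%JOBS_SERVERLESS%' AND
  u.custom_tags['LakehouseMonitoring'] = 'true'
GROUP BY u.usage_date
ORDER BY u.usage_date DESC
```

List prices do not reflect account-level discounts, so treat the result as an upper-bound estimate.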

View usage from the billing portal

You can also check data profiling expenses using the billing portal.

  1. Log in to the Databricks account console.
  2. In the sidebar, click the Usage icon.
  3. On the Usage page, select Consumption.
  4. Click the Setup dashboard icon.
  5. In the Tag Key drop-down menu, select LakehouseMonitoring.