Step 4: Create the log delivery configuration

This article describes how to call the log delivery API using the Databricks CLI. This is the final step in configuring audit log delivery.

Create the log delivery configuration

Pass the following values when creating your log-delivery configuration:

  • log_type: Set to AUDIT_LOGS.
  • output_format: Set to JSON.
  • config_name: A new, unique name for your log-delivery configuration.
  • credentials_id: Your Databricks credential configuration ID, which represents your cross-account role credentials.
  • storage_configuration_id: Your Databricks storage configuration ID, which represents your root S3 bucket.
  • delivery_path_prefix: (Optional) Set to the path prefix. This must match the path prefix that you used in your role policy. The delivery path is <bucket-name>/<delivery-path-prefix>/workspaceId=<workspaceId>/date=<yyyy-mm-dd>/auditlogs_<internal-id>.json. If you configure audit log delivery for the entire account, account-level audit events that are not associated with any single workspace are delivered to the workspaceId=0 partition.
  • workspace_ids_filter: (Optional) To ensure delivery of account-level events, including Unity Catalog and Delta Sharing events, leave workspace_ids_filter empty. If you want logs only for select workspaces, set it to an array of workspace IDs (each one is an int64). If you add specific workspace IDs in this field, you won’t receive account-level logs or logs for workspaces created in the future.
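If you generate the request body from a script, the rules above (required fields, the optional prefix, and the account-level vs. per-workspace filter choice) can be sketched in Python. This is an illustrative helper, not part of the Databricks CLI or SDK; the placeholder IDs are assumptions:

```python
import json

def build_log_delivery_payload(config_name, credentials_id,
                               storage_configuration_id,
                               delivery_path_prefix=None,
                               workspace_ids_filter=None):
    """Build the JSON body for a log-delivery create call.

    Leaving workspace_ids_filter as None (or empty) requests delivery for
    the entire account, including account-level events such as Unity
    Catalog and Delta Sharing events.
    """
    config = {
        "log_type": "AUDIT_LOGS",    # audit log delivery, per this guide
        "output_format": "JSON",     # audit logs are delivered as JSON
        "config_name": config_name,
        "credentials_id": credentials_id,
        "storage_configuration_id": storage_configuration_id,
    }
    if delivery_path_prefix:
        # Must match the path prefix used in your role policy.
        config["delivery_path_prefix"] = delivery_path_prefix
    if workspace_ids_filter:
        # Workspace IDs are int64 values; coerce to int defensively.
        config["workspace_ids_filter"] = [int(w) for w in workspace_ids_filter]
    return json.dumps({"log_delivery_configuration": config}, indent=2)

# Placeholder IDs below are illustrative, not real configuration IDs.
payload = build_log_delivery_payload(
    "audit log config",
    "<databricks-credentials-id>",
    "<databricks-storage-config-id>",
    delivery_path_prefix="auditlogs-data",
)
```

The resulting string can be passed to the CLI's `--json` flag, as in the example command below.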

Here is an example CLI command that creates a new log delivery configuration:

Bash
databricks account log-delivery create --json '{
  "log_delivery_configuration": {
    "log_type": "AUDIT_LOGS",
    "config_name": "audit log config",
    "output_format": "JSON",
    "credentials_id": "<databricks-credentials-id>",
    "storage_configuration_id": "<databricks-storage-config-id>",
    "delivery_path_prefix": "auditlogs-data",
    "workspace_ids_filter": [
      6383650456894062,
      4102272838062927
    ]
  }
}'

Example response

JSON
{
  "log_delivery_configuration": {
    "config_id": "<config-id>",
    "config_name": "audit log config",
    "log_type": "AUDIT_LOGS",
    "output_format": "JSON",
    "account_id": "<account-id>",
    "credentials_id": "<databricks-credentials-id>",
    "storage_configuration_id": "<databricks-storage-config-id>",
    "workspace_ids_filter": [6383650456894062, 4102272838062927],
    "delivery_path_prefix": "auditlogs-data",
    "status": "ENABLED",
    "creation_time": 1591638409000,
    "update_time": 1593108904000,
    "log_delivery_status": {
      "status": "CREATED",
      "message": "Log Delivery Configuration is successfully created. Status will be updated after the first delivery attempt."
    }
  }
}
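If you automate this step, you can parse the response to confirm the configuration was created and capture its config_id for later calls. A minimal Python sketch, using the fields shown in the example response above (the helper name is illustrative, not a Databricks API):

```python
import json

def parse_delivery_response(response_text):
    """Return (config_id, enabled) from a log-delivery create response."""
    cfg = json.loads(response_text)["log_delivery_configuration"]
    return cfg["config_id"], cfg["status"] == "ENABLED"

# Trimmed-down version of the example response above.
example = '''{
  "log_delivery_configuration": {
    "config_id": "<config-id>",
    "status": "ENABLED",
    "log_delivery_status": {"status": "CREATED"}
  }
}'''

config_id, enabled = parse_delivery_response(example)
# For the example response, enabled is True.
```

The top-level status reflects whether the configuration is enabled; log_delivery_status reports the outcome of delivery attempts and is updated after the first attempt.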
Note

After initial setup or other log delivery configuration changes, expect a delay of up to one hour before changes take effect. After log delivery begins, auditable events are typically logged within 15 minutes.

Next steps

Once you’ve configured audit log delivery, see the Audit log reference to learn about the log schema and the events that are logged.