Step 1: Configure audit log storage

This article explains how to set up an AWS S3 bucket for low-latency delivery of your audit logs.

Create the S3 bucket

  1. Log in to the AWS Console as a user with administrator privileges and go to the S3 service.

  2. Click the Create bucket button.

  3. In Bucket name, enter a name for your bucket. For more bucket naming guidance, see the AWS bucket naming rules.

  4. Click Create bucket.
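
Alternatively, you can create the bucket from the command line. The following is a minimal sketch that assumes the AWS CLI is installed and configured with administrator credentials; the bucket name and region are placeholders you should replace with your own values:

# Create the S3 bucket (for us-east-1, omit the --create-bucket-configuration flag).
aws s3api create-bucket \
  --bucket my-company-example-bucket \
  --region us-west-2 \
  --create-bucket-configuration LocationConstraint=us-west-2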

Create a Databricks storage configuration record

Next, create a Databricks storage configuration record that represents your new S3 bucket.

Register the bucket by calling the create storage configuration API. Pass the following values:

  • storage_configuration_name: A new, unique name for your storage configuration.

  • root_bucket_info: A JSON object whose bucket_name field holds your S3 bucket name.

For example:

curl -X POST \
  'https://accounts.cloud.databricks.com/api/2.0/accounts/<databricks-account-id>/storage-configurations' \
  --header "Authorization: Bearer $OAUTH_TOKEN" \
  -d '{
    "storage_configuration_name": "databricks-workspace-storageconf-v1",
    "root_bucket_info": {
      "bucket_name": "my-company-example-bucket"
    }
  }'
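
The example assumes $OAUTH_TOKEN already holds a valid account-level OAuth token. If you authenticate as a service principal, one way to mint such a token is the OAuth client-credentials flow sketched below; CLIENT_ID and CLIENT_SECRET are placeholders for your service principal's credentials, and jq is assumed to be installed:

# Request an account-level OAuth token using a service principal's credentials.
OAUTH_TOKEN=$(curl -s -X POST \
  'https://accounts.cloud.databricks.com/oidc/accounts/<databricks-account-id>/v1/token' \
  -u "$CLIENT_ID:$CLIENT_SECRET" \
  -d 'grant_type=client_credentials&scope=all-apis' \
  | jq -r '.access_token')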

Response:

{
  "storage_configuration_id": "<databricks-storage-config-id>",
  "account_id": "<databricks-account-id>",
  "root_bucket_info": {
    "bucket_name": "my-company-example-bucket"
  },
  "storage_configuration_name": "databricks-workspace-storageconf-v1",
  "creation_time": 1579754875555
}

Copy the storage_configuration_id value returned in the response body. You’ll need it when you call the log delivery API.
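
If you script the call, you can capture the ID directly from the response. This sketch reuses the request above and assumes jq is installed; STORAGE_CONFIG_ID is a hypothetical variable name:

# Send the request and extract storage_configuration_id from the JSON response.
STORAGE_CONFIG_ID=$(curl -s -X POST \
  'https://accounts.cloud.databricks.com/api/2.0/accounts/<databricks-account-id>/storage-configurations' \
  --header "Authorization: Bearer $OAUTH_TOKEN" \
  -d '{"storage_configuration_name": "databricks-workspace-storageconf-v1", "root_bucket_info": {"bucket_name": "my-company-example-bucket"}}' \
  | jq -r '.storage_configuration_id')
echo "$STORAGE_CONFIG_ID"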

Next steps

Next, configure an IAM role and create a credential in Databricks. See Step 2: Configure credentials for audit log delivery.