Account Console

The Account Console is where you administer account-level configuration for your Databricks account. Only the account owner, the user who initially created the Databricks account, can log in to the Account Console.

Access the Account Console

Click the user profile icon at the top right and select Manage Account.

Billing Details

In the Account Console, click the Billing Details tab to view and configure billing details for your account.

AWS Account

In the Account Console, click the AWS Account tab to configure AWS access from Databricks.

AWS Storage

In the Account Console, click the AWS Storage tab to configure AWS storage settings for your account.

Audit Logs

In the Account Console, click the Audit Logs tab to configure audit logs for your account.

Databricks Account

In the Account Console, click the Deploy Databricks and Delete Account tabs to manage your Databricks account.

Usage

In the Account Console, click the Usage Overview tab and select a <month year> to see historical account usage grouped by workload type (Data Analytics, Data Engineering, Data Engineering Light).

Click the Download itemized usage button to download a CSV file containing the usage data. The file uses this schema:

  • workspaceId (string) - ID of the workspace. Example: 1234567890123456
  • timestamp (datetime) - End of the hour for the provided usage. Example: 2019-02-22T09:59:59Z
  • clusterId (string) - ID of the cluster. Example: 0405-020048-brawl507
  • clusterName (string) - User-provided name for the cluster. Example: Shared Autoscaling
  • clusterNodeType (string) - Instance type of the cluster. Example: m4.16xlarge
  • clusterOwnerUserId (string) - ID of the user who created the cluster. Example: 12345678901234
  • clusterCustomTags (string, "-escaped JSON) - Custom tags associated with the cluster during this hour. Example: "{""dept"":""mktg"",""op_phase"":""dev""}"
  • sku (string) - Billing SKU; one of the values listed below. Example: STANDARD_AUTOMATED_OPSEC
  • dbus (double) - Number of DBUs used by the user during this hour. Example: 1.2345
  • machineHours (double) - Total number of machine hours used by all containers in the cluster. Example: 12.345

The sku value is one of:

  • STANDARD_INTERACTIVE_<suffix> - equivalent to Data Analytics
  • STANDARD_AUTOMATED_<suffix> - equivalent to Data Engineering
  • LIGHT_AUTOMATED_<suffix> - equivalent to Data Engineering Light

where <suffix> is OPSEC or NON_OPSEC, depending on whether the Databricks Operational Security Package is enabled or disabled.
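
If you prefer not to rely on schema inference when you load the file (as in the snippet below), here is a minimal sketch of an explicit Spark schema. The column names and types come from the list above; the variable name usage_schema and the specific Spark types chosen are assumptions:

from pyspark.sql.types import (StructType, StructField,
                               StringType, TimestampType, DoubleType)

# Explicit schema matching the documented CSV columns, as an
# alternative to inferSchema; adjust types if your export differs
usage_schema = StructType([
    StructField("workspaceId", StringType()),
    StructField("timestamp", TimestampType()),
    StructField("clusterId", StringType()),
    StructField("clusterName", StringType()),
    StructField("clusterNodeType", StringType()),
    StructField("clusterOwnerUserId", StringType()),
    StructField("clusterCustomTags", StringType()),
    StructField("sku", StringType()),
    StructField("dbus", DoubleType()),
    StructField("machineHours", DoubleType()),
])

You could then pass this via .schema(usage_schema) in place of the inferSchema option when reading the CSV.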

You can import the CSV file into Databricks using the Create New Table UI, or use the following code to load it and register a temporary usage view:

# Load the downloaded usage CSV; the escape option handles the
# "-escaped JSON in the clusterCustomTags column
df = (spark.read
      .option("header", "true")
      .option("inferSchema", "true")
      .option("escape", "\"")
      .csv("/FileStore/tables/usage_data.csv"))

# Expose the data to SQL queries as a temporary view named "usage"
df.createOrReplaceTempView("usage")
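
Because clusterCustomTags holds escaped JSON, you may want to expand it into a map column before querying. A minimal sketch using from_json; the customTags column name and the dept tag key are illustrations taken from the example row above, not part of the file schema:

from pyspark.sql.functions import from_json, col
from pyspark.sql.types import MapType, StringType

# Parse the escaped-JSON tags into a map<string,string> column
tagged = df.withColumn(
    "customTags",
    from_json(col("clusterCustomTags"), MapType(StringType(), StringType()))
)

# Example: total DBUs per value of the illustrative "dept" tag
tagged.groupBy(tagged["customTags"]["dept"].alias("dept")).sum("dbus").show()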

Total DBUs are the sum of the dbus column. To map a user ID to a user email address, call the SCIM API to get a list of users, then join the result to the usage data and sum the DBUs per user.
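
A minimal sketch of that flow, assuming the usage view registered above; the workspace URL, the personal access token placeholder, and the exact SCIM response fields (Resources, id, userName) are assumptions to verify against the SCIM API reference:

import requests

# Sum DBUs per cluster owner using the temporary view registered above
dbus_per_user = spark.sql("""
    SELECT clusterOwnerUserId, SUM(dbus) AS totalDbus
    FROM usage
    GROUP BY clusterOwnerUserId
    ORDER BY totalDbus DESC
""")

# List users via the SCIM API; host and token are placeholders
resp = requests.get(
    "https://<your-workspace>.cloud.databricks.com/api/2.0/preview/scim/v2/Users",
    headers={"Authorization": "Bearer <personal-access-token>"},
)
resp.raise_for_status()

# Assumes the SCIM list response carries users under "Resources",
# with "id" and "userName" (the email address) on each entry
id_to_email = {u["id"]: u["userName"] for u in resp.json().get("Resources", [])}

# Join on the driver; SCIM ids are strings, while inferSchema may
# have read clusterOwnerUserId as a number, hence the str() cast
for row in dbus_per_user.collect():
    email = id_to_email.get(str(row["clusterOwnerUserId"]), "unknown")
    print(email, row["totalDbus"])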