Platform administration cheat sheet
This article provides clear, opinionated guidance for account and workspace admins on recommended best practices. Implement the following practices to help optimize cost, observability, data governance, and security in your Databricks account.
For in-depth security best practices, see this PDF: Databricks AWS Security Best Practices and Threat Model.
| Best practice | Impact |
| --- | --- |
| Enable Unity Catalog | Data governance: Unity Catalog provides centralized access control, auditing, lineage, and data discovery across Databricks workspaces. |
| Use cluster policies | Cost: Control costs with auto-termination (for all-purpose clusters), maximum cluster sizes, and instance type restrictions. Observability: Enforce cluster tagging so usage can be attributed. Security: Restrict cluster access mode so users can only create Unity Catalog-enabled clusters, enforcing data permissions. |
| Use service principals to connect to third-party software | Security: A service principal is a Databricks identity type that lets third-party services authenticate directly to Databricks rather than through an individual user's credentials. If something happens to an individual user's credentials, the third-party service isn't interrupted. |
| Set up SSO | Security: Instead of having users type their email to log in to a workspace, set up Databricks SSO so users authenticate through your identity provider. |
| Set up SCIM integration | Security: Instead of adding users to Databricks manually, integrate with your identity provider to automate user provisioning and deprovisioning. When a user is removed from the identity provider, they are automatically removed from Databricks as well. |
| Manage access control with account-level groups | Data governance: Create account-level groups so you can bulk-control access to workspaces, resources, and data. This saves you from granting all users access to everything or granting individual users specific permissions. You can also sync groups from your identity provider to Databricks groups. |
| Set up IP access lists | Security: IP access lists prevent users from accessing Databricks resources from unsecured networks. Accessing a cloud service from an unsecured network can pose security risks to an enterprise, especially when the user may have authorized access to sensitive or personal data. Set up IP access lists for both your account console and your workspaces. |
| Configure a customer-managed VPC with regional endpoints | Security: Use a customer-managed VPC to exercise more control over your network configurations and comply with the cloud security and governance standards your organization requires. Cost: Regional VPC endpoints to AWS services provide more direct connections at reduced cost compared to AWS global endpoints. |
| Use Databricks secrets or a cloud provider secrets manager | Security: Databricks secrets let you securely store credentials for external data sources. Instead of entering credentials directly into a notebook, you can reference a secret to authenticate to a data source. |
| Set expiration dates on personal access tokens (PATs) | Security: Workspace admins can manage PATs for users, groups, and service principals. Setting expiration dates reduces the risk of lost or long-lived tokens that could lead to data exfiltration from the workspace. |
| Use system tables to monitor account usage | Observability: System tables are a Databricks-hosted analytical store of your account's operational data, including audit logs, data lineage, and billable usage. Use system tables for observability across your account. |
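To make the cluster policy recommendation concrete, the sketch below assembles a policy definition as JSON. The rule shapes (`fixed`, `range`, `allowlist`) follow the Databricks cluster policy definition format, but the specific node types, limits, and tag name are illustrative assumptions, not recommended values.

```python
import json

# Sketch of a cluster policy definition. The limits, instance types, and
# tag name below are assumptions for illustration only.
policy = {
    # Cost: force auto-termination on all-purpose clusters (max 60 min idle).
    "autotermination_minutes": {"type": "range", "maxValue": 60, "defaultValue": 30},
    # Cost: cap cluster size.
    "autoscale.max_workers": {"type": "range", "maxValue": 10},
    # Cost: restrict instance types to an approved list.
    "node_type_id": {"type": "allowlist", "values": ["i3.xlarge", "i3.2xlarge"]},
    # Observability: require a team tag so usage can be attributed.
    "custom_tags.team": {"type": "fixed", "value": "data-platform"},
    # Security: only allow Unity Catalog-enabled access modes.
    "data_security_mode": {"type": "allowlist", "values": ["USER_ISOLATION", "SINGLE_USER"]},
}

print(json.dumps(policy, indent=2))
```

A policy like this is created once by an admin and then attached to the users or groups who may create clusters, so the restrictions apply automatically at cluster creation time.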
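For the IP access list row, the snippet below sketches the request body you would send to the workspace IP access lists REST endpoint (`POST /api/2.0/ip-access-lists`). The label and CIDR blocks are made-up examples; validating them locally with the standard-library `ipaddress` module catches typos before they reach the API.

```python
import ipaddress
import json

# Illustrative CIDR blocks for an office network (assumptions, not real ranges).
allowed_cidrs = ["203.0.113.0/24", "198.51.100.7/32"]

# Validate each CIDR block before sending it to the API; a malformed
# block raises ValueError here instead of failing server-side.
for cidr in allowed_cidrs:
    ipaddress.ip_network(cidr)

# Request body for creating an ALLOW list (BLOCK lists are also supported).
payload = {
    "label": "corporate-offices",
    "list_type": "ALLOW",
    "ip_addresses": allowed_cidrs,
}

print(json.dumps(payload))
```

Remember that the account console and each workspace have separate IP access list configurations, so a payload like this needs to be applied to both.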
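For the PAT expiration row, a token's lifetime is bounded at creation time via the `lifetime_seconds` field of the token API request (`POST /api/2.0/token/create`). The 90-day limit and the comment below are assumptions chosen for illustration; pick a lifetime that matches your organization's rotation policy.

```python
# Sketch of a token-create request body with an enforced expiration.
# The 90-day lifetime is an assumed organizational policy, not a default.
MAX_TOKEN_DAYS = 90

payload = {
    "comment": "ETL service token",  # illustrative description
    "lifetime_seconds": MAX_TOKEN_DAYS * 24 * 60 * 60,
}

print(payload["lifetime_seconds"])  # 7776000
```

Omitting `lifetime_seconds` creates a token that never expires, which is exactly the long-lived credential this practice is meant to avoid.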