Best practices for security, compliance, and privacy
The Databricks Security Best Practices guide, including a downloadable PDF, is available on the Databricks Security & Trust Center. The sections in this article list the best practices from this guide, organized by the principles of this pillar.
1. Manage identity and access using least privilege
Account setup and identity configuration
During deployment, configure Databricks account administration, SSO, and user provisioning to establish a secure foundation:
- Assign account admin roles to 2-3 trusted individuals only
- Configure SSO with Google Workspace or other identity providers using OIDC or SAML
- Enable SCIM provisioning to automate user and group synchronization from your identity provider
- Set up identity federation to link corporate identities across workspaces
- Configure multifactor authentication at the identity provider level
- Define emergency access procedures for account recovery
For step-by-step account setup procedures, see Phase 1: Design account and identity strategy.
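Enabling SCIM provisioning (one of the steps above) means your identity provider pushes users and groups to Databricks as SCIM 2.0 payloads. As a rough sketch of what that synchronization carries, the snippet below builds a minimal SCIM user document; the user name and display name are illustrative assumptions, not output copied from any particular identity provider:

```python
# Sketch: the shape of a SCIM 2.0 user payload an identity provider
# sends when provisioning a user. Field values are illustrative assumptions.
SCIM_USER_SCHEMA = "urn:ietf:params:scim:schemas:core:2.0:User"

def build_scim_user(user_name: str, display_name: str, active: bool = True) -> dict:
    """Build a minimal SCIM 2.0 user payload."""
    return {
        "schemas": [SCIM_USER_SCHEMA],
        "userName": user_name,      # typically the corporate email address
        "displayName": display_name,
        "active": active,           # deactivate rather than delete to keep audit history
    }

payload = build_scim_user("ada@example.com", "Ada Lovelace")
```

Deactivating users (`"active": false`) rather than deleting them preserves ownership and audit history, which is why SCIM-driven offboarding is preferred over manual removal.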
Identity and access management best practices
- Leverage multi-factor authentication
- Use SCIM to synchronize users and groups
- Limit the number of admin users
- Enforce segregation of duties between administrative accounts
- Restrict workspace admins
- Manage access according to the principle of least privilege
- Use OAuth token authentication
- Enforce token management
- Restrict cluster creation rights
- Use compute policies
- Use service principals to run administrative tasks and production workloads
- Use compute that supports user isolation
- Store and use secrets securely
- Consider post-deployment hardening steps
Details are in the PDF referenced at the beginning of this article.
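A compute policy ("Use compute policies" above) is a JSON document of per-attribute rules that limits what users can configure when they create compute. A minimal sketch of such a definition, where the node types, tag value, and limits are illustrative assumptions:

```python
import json

# Sketch of a compute (cluster) policy definition: each key constrains
# one cluster attribute. Node types and limits are assumptions.
policy = {
    "autotermination_minutes": {
        "type": "range",
        "maxValue": 60,        # force clusters to auto-terminate within an hour
        "defaultValue": 30,
    },
    "node_type_id": {
        "type": "allowlist",   # restrict users to approved instance types
        "values": ["n2-standard-4", "n2-standard-8"],
    },
    "custom_tags.team": {
        "type": "fixed",       # stamp every cluster for cost attribution
        "value": "data-platform",
    },
}

policy_json = json.dumps(policy, indent=2)
```

Policies like this pair naturally with the "Restrict cluster creation rights" practice: users who cannot create unrestricted clusters can still self-serve within the guardrails the policy defines.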
2. Protect data in transit and at rest
- Centralize data governance with Unity Catalog
- Plan your data isolation model
- Avoid storing production data in DBFS
- Secure your GCS buckets and prevent public access
- Use VPC Service Controls
- Protect your GCS data with soft delete
- Back up your GCS data with dual-region buckets
- Configure customer-managed keys for managed services
- Configure customer-managed keys for storage
- Use Delta Sharing
- Configure a Delta Sharing recipient token lifetime
- Additionally encrypt sensitive data at rest using Advanced Encryption Standard (AES)
- Leverage data exfiltration prevention settings within the workspace
Details are in the PDF referenced at the beginning of this article.
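"Secure your GCS buckets and prevent public access" can be spot-checked by scanning a bucket's IAM policy for the two public principals GCP defines. A minimal sketch over a policy document you have already fetched; the bindings shown are illustrative assumptions:

```python
# Sketch: detect public access in a GCS bucket IAM policy document.
# The dict mirrors the bindings structure of a GCS IAM policy;
# the roles and members here are illustrative assumptions.
PUBLIC_PRINCIPALS = {"allUsers", "allAuthenticatedUsers"}

def public_bindings(iam_policy: dict) -> list:
    """Return the roles granted to public principals, if any."""
    return [
        binding["role"]
        for binding in iam_policy.get("bindings", [])
        if PUBLIC_PRINCIPALS & set(binding.get("members", []))
    ]

example_policy = {
    "bindings": [
        {"role": "roles/storage.objectViewer", "members": ["allUsers"]},
        {"role": "roles/storage.admin", "members": ["user:ops@example.com"]},
    ]
}

exposed = public_bindings(example_policy)  # non-empty means the bucket is public
```

In practice you would enforce this preventively with organization-level public access prevention rather than detect it after the fact, but a scan like this is a useful audit backstop.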
3. Secure your network and protect endpoints
Network deployment considerations for GCP
The following steps establish secure network connectivity for Databricks workspaces on GCP:
- Create a custom mode VPC (not auto mode) for workspace deployments
- Provision subnets with primary IP range and secondary ranges for GKE pods and services
- Configure firewall rules to allow internal traffic and restrict external access
- Set up Cloud NAT for outbound internet access from cluster nodes
- Deploy Private Service Connect (PSC) for private connectivity to Databricks control plane
- Configure Cloud VPN or Cloud Interconnect for on-premises connectivity (if required)
- Implement network segmentation to isolate production and non-production environments
For step-by-step GCP network configuration, see GCP network architecture.
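When provisioning the subnet and secondary ranges above, it is worth verifying that the planned CIDR blocks do not overlap and are large enough before creating the VPC. A small sketch using Python's standard ipaddress module; the example ranges are assumptions, not recommended values:

```python
import ipaddress

# Sketch: sanity-check a planned custom-mode VPC subnet layout.
# CIDR blocks are illustrative assumptions: one primary node range
# plus secondary ranges for GKE pods and services.
ranges = {
    "nodes (primary)": ipaddress.ip_network("10.0.0.0/24"),
    "gke-pods": ipaddress.ip_network("10.4.0.0/19"),
    "gke-services": ipaddress.ip_network("10.8.0.0/22"),
}

def overlapping_pairs(named_ranges: dict) -> list:
    """Return every pair of ranges that overlap (should be empty)."""
    names = list(named_ranges)
    return [
        (a, b)
        for i, a in enumerate(names)
        for b in names[i + 1:]
        if named_ranges[a].overlaps(named_ranges[b])
    ]

conflicts = overlapping_pairs(ranges)
pod_ips = ranges["gke-pods"].num_addresses  # a /19 provides 8192 addresses
```

Catching an overlap at planning time is far cheaper than discovering it after the workspace is deployed, since secondary ranges cannot easily be resized in place.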
Network security best practices
- Use a customer-managed VPC
- Configure IP access lists
- Use GCP Private Service Connect
- Implement network exfiltration protections
- Isolate sensitive workloads into different networks
- Configure a firewall for serverless compute access
- Restrict access to valuable codebases to only trusted networks
Details are in the PDF referenced at the beginning of this article.
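The "Configure IP access lists" practice admits workspace access only from approved CIDR ranges. The membership test the feature performs can be sketched with the standard ipaddress module; the allow list below is an illustrative assumption:

```python
import ipaddress

# Sketch: the check an IP access list performs — does the caller's
# address fall inside any allowed CIDR block? CIDRs are assumptions.
ALLOW_LIST = [ipaddress.ip_network(c) for c in ("203.0.113.0/24", "198.51.100.0/28")]

def is_allowed(client_ip: str) -> bool:
    """True if the client address falls inside an allowed range."""
    addr = ipaddress.ip_address(client_ip)
    return any(addr in net for net in ALLOW_LIST)

office = is_allowed("203.0.113.42")   # inside the first range
stranger = is_allowed("192.0.2.10")   # outside every range
```

Keep the allow list as narrow as practical (corporate egress ranges and VPN exits, not broad /8 blocks), consistent with the least-privilege principle from section 1.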
4. Meet compliance and data privacy requirements
- Restart compute on a regular schedule
- Isolate sensitive workloads into different workspaces
- Assign Unity Catalog securables to specific workspaces
- Implement fine-grained access controls
- Apply tags
- Use lineage
- Control and monitor workspace access for Databricks personnel
- Implement and test a Disaster Recovery strategy
Details are in the PDF referenced at the beginning of this article.
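The "Apply tags" practice is easiest to enforce with an automated check that every resource carries the tag keys your compliance process requires. A minimal sketch, where the required keys and sample resources are illustrative assumptions:

```python
# Sketch: verify resources carry the tag keys a compliance process
# requires. Required keys and sample resources are assumptions.
REQUIRED_TAGS = {"owner", "cost-center", "data-classification"}

def missing_tags(resource_tags: dict) -> set:
    """Return the required tag keys absent from a resource."""
    return REQUIRED_TAGS - resource_tags.keys()

compliant = missing_tags({"owner": "ml-team", "cost-center": "cc-42",
                          "data-classification": "confidential"})
noncompliant = missing_tags({"owner": "ml-team"})
```

Running such a check in CI or a scheduled job turns tagging from a convention into an enforced control, which also feeds the cost monitoring and charge-back practices in section 5.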
5. Monitor system security
- Leverage system tables
- Monitor system activities via GCP Cloud Audit Logs
- Enable verbose audit logging
- Manage code versions with Git folders
- Restrict usage to trusted code repositories
- Provision infrastructure via infrastructure-as-code
- Manage code via CI/CD
- Control library installation
- Use models and data from only trusted or reputable sources
- Use data quality monitoring
- Implement DevSecOps processes
- Use tagging as part of your cost monitoring and charge-back strategy
- Use budgets to monitor account spending
- Use Organization Policies
Details are in the PDF referenced at the beginning of this article.
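The "Use budgets to monitor account spending" practice reduces to comparing accrued spend against a set of thresholds and alerting as each is crossed. A sketch of that logic, where the amounts and threshold fractions are illustrative assumptions:

```python
# Sketch: budget-alert logic — report which alert thresholds the
# month-to-date spend has crossed. The numbers are assumptions.
def crossed_thresholds(spend: float, budget: float,
                       thresholds=(0.5, 0.8, 1.0)) -> list:
    """Return the budget fractions that spend has met or exceeded."""
    if budget <= 0:
        raise ValueError("budget must be positive")
    return [t for t in thresholds if spend >= t * budget]

alerts = crossed_thresholds(spend=8_500.0, budget=10_000.0)
```

Alerting at intermediate thresholds (50%, 80%) rather than only at 100% gives teams time to investigate anomalous spend before the budget is exhausted.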
Additional resources
- Review the Security and Trust Center to understand how security is built into every layer of the Databricks Data Intelligence Platform, and the shared responsibility model we operate under.
- Download and review the Databricks AI Security Framework (DASF) to understand how to mitigate AI security threats based on real-world attack scenarios.