Best practices for security, compliance, and privacy
The Databricks on AWS Security Best Practices and Threat Model can be downloaded as a PDF document from the Security and Trust Center. The sections in this article list the best practices from the PDF, organized according to the principles of this pillar.
1. Manage identity and access using least privilege
- Authenticate via single sign-on (SSO) at the account level
- Leverage multi-factor authentication
- Enable unified login and configure emergency access
- Use SCIM to synchronize users and groups
- Limit the number of admin users
- Enforce segregation of duties between administrative accounts
- Restrict workspace admins
- Manage access according to the principle of least privilege
- Use OAuth token authentication
- Enforce token management
- Restrict cluster creation rights
- Use compute policies
- Use service principals to run administrative tasks and production workloads
- Use compute that supports user isolation
- Store and use secrets securely
- Use a restricted cross-account IAM role
Details are in the PDF referenced at the beginning of this article.
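As one concrete illustration of the "Restrict cluster creation rights" and "Use compute policies" recommendations above, cluster creation can be constrained with a compute policy. The following is a minimal, illustrative policy definition: the attribute names follow the Databricks cluster policy JSON format, while the specific runtime versions, node types, and tag values are placeholders you would replace with your own standards.

```json
{
  "spark_version": {
    "type": "allowlist",
    "values": ["14.3.x-scala2.12", "15.4.x-scala2.12"],
    "defaultValue": "15.4.x-scala2.12"
  },
  "autotermination_minutes": {
    "type": "range",
    "minValue": 10,
    "maxValue": 120,
    "defaultValue": 60
  },
  "node_type_id": {
    "type": "allowlist",
    "values": ["m5.xlarge", "m5.2xlarge"]
  },
  "custom_tags.CostCenter": {
    "type": "fixed",
    "value": "data-platform"
  }
}
```

A policy like this limits users to approved runtimes and instance types, forces auto-termination, and pins a cost-center tag, so non-admin users can create compute without being able to deviate from those guardrails.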
2. Protect data in transit and at rest
- Centralize data governance with Unity Catalog
- Plan your data isolation model
- Avoid storing production data in DBFS
- Encrypt your S3 buckets and prevent public access
- Apply bucket policies
- Use S3 versioning
- Back up your S3 data
- Configure customer-managed keys for managed services
- Configure customer-managed keys for storage
- Use Delta Sharing
- Configure a Delta Sharing recipient token lifetime
- Additionally encrypt sensitive data at rest using Advanced Encryption Standard (AES)
- Leverage data exfiltration prevention settings within the workspace
- Use Clean Rooms to collaborate in a privacy-safe environment
Details are in the PDF referenced at the beginning of this article.
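Several of the bullets above (encrypt your S3 buckets, prevent public access, apply bucket policies) translate directly into an S3 bucket policy. The sketch below denies non-TLS access and unencrypted uploads; the bucket name is a placeholder, and Block Public Access is typically enabled as a separate account- or bucket-level setting rather than in the policy itself.

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "DenyInsecureTransport",
      "Effect": "Deny",
      "Principal": "*",
      "Action": "s3:*",
      "Resource": [
        "arn:aws:s3:::my-databricks-data",
        "arn:aws:s3:::my-databricks-data/*"
      ],
      "Condition": { "Bool": { "aws:SecureTransport": "false" } }
    },
    {
      "Sid": "DenyUnencryptedUploads",
      "Effect": "Deny",
      "Principal": "*",
      "Action": "s3:PutObject",
      "Resource": "arn:aws:s3:::my-databricks-data/*",
      "Condition": { "StringNotEquals": { "s3:x-amz-server-side-encryption": "aws:kms" } }
    }
  ]
}
```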
3. Secure your network and protect endpoints
- Use a customer-managed VPC
- Configure IP access lists
- Use AWS PrivateLink
- Implement network exfiltration protections
- Isolate sensitive workloads into different networks
- Configure a firewall for serverless compute access
- Restrict access to valuable codebases to only trusted networks
Details are in the PDF referenced at the beginning of this article.
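To make the "Configure IP access lists" bullet concrete: IP access lists can be managed through the Databricks workspace REST API (`POST /api/2.0/ip-access-lists`). An illustrative request body with placeholder label and CIDR ranges is shown below; the feature must be enabled for the workspace before allow lists take effect.

```json
{
  "label": "corp-vpn",
  "list_type": "ALLOW",
  "ip_addresses": [
    "203.0.113.0/24",
    "198.51.100.17"
  ]
}
```

Once at least one ALLOW list is active, requests from addresses outside the listed ranges are rejected, so take care to include the range you are calling the API from.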
4. Meet compliance and data privacy requirements
- Restart compute on a regular schedule
- Isolate sensitive workloads into different workspaces
- Assign Unity Catalog securables to specific workspaces
- Implement fine-grained access controls
- Apply tags
- Use lineage
- Use AWS Nitro instances
- Use Enhanced Security Monitoring or Compliance Security Profile
- Control and monitor workspace access by Databricks personnel
- Implement and test a Disaster Recovery strategy
Details are in the PDF referenced at the beginning of this article.
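The "Apply tags" recommendation can be implemented by setting `custom_tags` in the compute specification; tags propagate to the underlying AWS resources and into billing records, which also supports the cost-monitoring practices in the next section. An illustrative fragment of a cluster specification follows; the tag keys and values are placeholders for your organization's taxonomy.

```json
{
  "cluster_name": "compliance-workload",
  "autotermination_minutes": 60,
  "custom_tags": {
    "CostCenter": "fin-ops-123",
    "DataClassification": "confidential"
  }
}
```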
5. Monitor system security
- Leverage system tables
- Monitor system activities via AWS CloudTrail and other logs
- Enable verbose audit logging
- Manage code versions with Git folders
- Restrict usage to trusted code repositories
- Provision infrastructure via infrastructure-as-code
- Manage code via CI/CD
- Control library installation
- Use models and data from only trusted or reputable sources
- Implement DevSecOps processes
- Use lakehouse monitoring
- Use inference tables and AI Guardrails
- Use tagging as part of your cost monitoring and charge-back strategy
- Use budgets to monitor account spending
- Use AWS service quotas
Details are in the PDF referenced at the beginning of this article.
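To make the tagging and charge-back bullet concrete: once compute carries cost-center tags, usage records can be rolled up per tag. The pure-Python sketch below shows that aggregation; the record shape loosely mirrors billing usage data (such as the `system.billing.usage` system table), but the field names here are illustrative, not the table's exact schema.

```python
from collections import defaultdict

def usage_by_cost_center(records, default="untagged"):
    """Sum usage quantity per cost-center tag for charge-back reporting.

    Records missing a CostCenter tag are grouped under `default` so that
    untagged spend stays visible instead of silently disappearing.
    """
    totals = defaultdict(float)
    for rec in records:
        tag = rec.get("custom_tags", {}).get("CostCenter", default)
        totals[tag] += rec["usage_quantity"]
    return dict(totals)

# Illustrative usage rows (placeholder quantities and tags).
usage = [
    {"usage_quantity": 12.5, "custom_tags": {"CostCenter": "fin-ops-123"}},
    {"usage_quantity": 4.0, "custom_tags": {"CostCenter": "ml-research"}},
    {"usage_quantity": 7.5, "custom_tags": {"CostCenter": "fin-ops-123"}},
    {"usage_quantity": 1.0, "custom_tags": {}},
]

print(usage_by_cost_center(usage))
# → {'fin-ops-123': 20.0, 'ml-research': 4.0, 'untagged': 1.0}
```

In practice you would run the equivalent grouping as a query over the billing system tables, but the logic — group usage by tag, surface untagged spend explicitly — is the same.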
Additional Resources
- Review the Security and Trust Center to understand how security is built into every layer of the Databricks Data Intelligence Platform, and the shared responsibility model we operate under.
- Download and review the Databricks AI Security Framework (DASF) to understand how to mitigate AI security threats based on real-world attack scenarios.