# FedRAMP Moderate
This page describes FedRAMP Moderate compliance controls in Databricks.
## FedRAMP Moderate overview
FedRAMP Moderate is a U.S. federal program that standardizes security assessment, authorization, and continuous monitoring for cloud products and services at the moderate impact level. It enables federal agencies to use cloud technologies while ensuring the protection of federal data.
### Key points
- Applies to cloud services handling Controlled Unclassified Information (CUI).
- Requires compliance with NIST 800-53 moderate baseline controls.
- Emphasizes access control, incident response, continuous monitoring, and encryption.
- Databricks is a FedRAMP® Authorized Cloud Service Offering (CSO) at the moderate impact level in the AWS commercial regions us-east-1, us-east-2, us-west-1, and us-west-2.
- US Government agencies can access the Databricks on AWS FedRAMP® package on OMB MAX by submitting a Package Access Request Form to package-access@fedramp.gov.
- Additional information about Databricks and FedRAMP® compliance is available on the Databricks Security and Trust Center.
## Enable FedRAMP Moderate compliance controls
To process data regulated by the FedRAMP Moderate standard, your workspace must have the compliance security profile enabled. Only specific preview features are supported for processing regulated data. For details on the compliance security profile, supported preview features, and supported regions, see Compliance security profile.
You are solely responsible for verifying that sensitive information is never entered in customer-defined input fields, such as workspace names, compute resource names, tags, job names, job run names, network names, credential names, storage account names, and Git repository IDs or URLs. These fields might be stored, processed, or accessed outside the compliance boundary.
To enable FedRAMP Moderate compliance controls, see Configure enhanced security and compliance settings.
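As a rough sketch, the compliance security profile can be managed programmatically through the workspace settings REST API. The endpoint path, setting name, and field names below are assumptions based on the general shape of Databricks settings APIs, not a verified contract; check them against the current API reference before use.

```python
import json

# Hypothetical PATCH payload for enabling the compliance security profile
# with the FedRAMP Moderate standard. All field names here are assumptions;
# verify against the Databricks workspace settings API reference.
payload = {
    "setting": {
        "compliance_security_profile_workspace": {
            "is_enabled": True,
            "compliance_standards": ["FEDRAMP_MODERATE"],
        }
    },
    # field_mask tells the API which fields this PATCH updates.
    "field_mask": "compliance_security_profile_workspace",
}

# The request would be sent to a workspace settings endpoint, e.g.
# (illustrative path only, not executed here):
# PATCH /api/2.0/settings/types/shield_csp_enablement_ws_db/names/default
print(json.dumps(payload, indent=2))
```

Note that enabling the compliance security profile is permanent for a workspace, so treat any automation around it as a one-way operation.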
Serverless compute base environment version 5 or higher will soon be required for FedRAMP Moderate workloads on AWS. Databricks recommends upgrading to base environment version 5 now. To select a base environment for notebooks, see Select a base environment. To configure the environment for jobs, see Configure environment for job tasks.
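For jobs, pinning the serverless base environment version is done in the task's environment spec. The following is a minimal sketch of such a job payload; the field names follow the Jobs API environment spec as best understood, and the notebook path and keys are hypothetical placeholders.

```python
# Sketch of a Jobs API environment pinning serverless base environment
# version 5 for a notebook task. Field names are assumptions; confirm
# against the current Databricks Jobs API reference.
job_environment = {
    "environment_key": "fedramp_env",      # hypothetical key
    "spec": {
        "environment_version": "5",        # serverless base environment version
        "dependencies": [],                # optional pip requirements
    },
}

task = {
    "task_key": "regulated_workload",      # hypothetical task name
    "notebook_task": {"notebook_path": "/Workspace/pipelines/ingest"},
    # Reference the environment defined above so the task runs on it.
    "environment_key": job_environment["environment_key"],
}
```

The environment is defined once at the job level and referenced per task, so every task that must meet the FedRAMP requirement can share the same pinned base environment.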
## Regional support for features
| Feature | us-east-1 | us-east-2 | us-west-1 | us-west-2 |
|---|---|---|---|---|
AI Diagnose | ✓ | ✓ | ✓ | ✓ |
Agent Bricks - Knowledge Assistant | ||||
Agent Bricks - Supervisor Agent | ||||
Agent Bricks - ai_parse_document | ✓ | ✓ | ||
Agent Bricks - ai_query (Batch Inference) | ✓ | ✓ | ✓ | ✓ |
Agent Framework: On-Behalf-Of-User Authorization | ✓ | ✓ | ✓ | ✓ |
Anomaly Detection | ✓ | ✓ | ||
Attribute Based Access Control | ✓ | ✓ | ✓ | ✓ |
Classic Compute | ✓ | ✓ | ✓ | ✓ |
Clean Rooms | ✓ | ✓ | ||
Cluster Log Delivery to UC Volumes | ✓ | ✓ | ✓ | ✓ |
Cross-Platform View Sharing | ✓ | ✓ | ✓ | ✓ |
Custom JDBC on UC Compute | ✓ | ✓ | ✓ | ✓ |
Data Classification | ||||
Data Room (Govcloud) | ||||
Data Science Agent | ||||
Databricks Apps | ✓ | ✓ | ✓ | ✓ |
Databricks Apps - App Spaces | ✓ | ✓ | ✓ | ✓ |
Databricks Apps - Configure App Compute Size | ✓ | ✓ | ✓ | ✓ |
Databricks One | ✓ | ✓ | ✓ | ✓ |
Databricks SQL alerts | ✓ | ✓ | ✓ | ✓ |
Databricks SQL alerts job task | ✓ | ✓ | ✓ | ✓ |
Default Python package repositories in Spark Declarative Pipelines | ✓ | ✓ | ✓ | ✓ |
Default Python repo in clusters (API) | ✓ | ✓ | ✓ | ✓ |
Default Python repo in clusters (UI) | ✓ | ✓ | ✓ | ✓ |
Default Storage | ✓ | ✓ | ✓ | ✓ |
Default warehouse setting | ✓ | ✓ | ✓ | ✓ |
Enable Extended Models (Qwen) | ✓ | ✓ | ✓ | ✓ |
Enhanced Python UDFs in Unity Catalog | ✓ | ✓ | ✓ | ✓ |
EventBridge support for file events | ✓ | ✓ | ✓ | ✓ |
Excel File Format Support | ✓ | ✓ | ✓ | ✓ |
Exclusive Access | ✓ | ✓ | ✓ | ✓ |
Expiring personal access token notifications | ✓ | ✓ | ✓ | ✓ |
Focused notebook & file editor for Git folders | ✓ | ✓ | ✓ | ✓ |
Genie Code | ✓ | ✓ | ✓ | ✓ |
Genie Data Sampling | ✓ | ✓ | ✓ | ✓ |
Genie Research Agent | ✓ | ✓ | ||
Genie Spaces | ✓ | ✓ | ✓ | ✓ |
Git CLI support for Git folders | ✓ | ✓ | ✓ | ✓ |
Lakebase Autoscaling | ||||
Lakebase Provisioned | ||||
Lakeflow Connect for Confluence | ✓ | ✓ | ||
Lakeflow Connect for Dynamics 365 | ✓ | ✓ | ✓ | ✓ |
Lakeflow Connect for Google Ads | ✓ | ✓ | ||
Lakeflow Connect for Google Drive | ✓ | ✓ | ✓ | ✓ |
Lakeflow Connect for HubSpot | ✓ | ✓ | ||
Lakeflow Connect for Jira | ✓ | ✓ | ||
Lakeflow Connect for Meta Ads | ✓ | ✓ | ||
Lakeflow Connect for MySQL | ✓ | ✓ | ✓ | ✓ |
Lakeflow Connect for PostgreSQL | ✓ | ✓ | ✓ | ✓ |
Lakeflow Connect for SFTP | ||||
Lakeflow Connect for SharePoint | ✓ | ✓ | ✓ | ✓ |
Lakeflow Connect for TikTok Ads | ✓ | ✓ | ||
Lakeflow Connect for Zendesk Support | ✓ | ✓ | ||
Lakeflow Designer | ✓ | ✓ | ✓ | ✓ |
Lakeflow Jobs | ✓ | ✓ | ✓ | ✓ |
Lakeflow Pipelines Editor | ✓ | ✓ | ||
Lakeflow Query Based Connectors | ✓ | ✓ | ||
Lakehouse Monitoring | ✓ | ✓ | ||
MLflow on Databricks | ✓ | ✓ | ✓ | ✓ |
Managed MCP Servers | ✓ | ✓ | ✓ | ✓ |
Model Serving - AI Gateway | ✓ | ✓ | ✓ | ✓ |
Model Serving - AI Guardrail | ✓ | ✓ | ✓ | ✓ |
Model Serving - AI Playground | ✓ | ✓ | ✓ | ✓ |
Model Serving - Custom CPU/GPU model (ST) | ✓ | ✓ | ✓ | ✓ |
Model Serving - External Models | ✓ | ✓ | ✓ | ✓ |
Model Serving - Foundation Models AI Function | ✓ | ✓ | ✓ | ✓ |
Model Serving - Foundation Models Pay-Per-Token | ✓ | ✓ | ✓ | ✓ |
Model update job triggers | ✓ | ✓ | ✓ | ✓ |
Models in Unity Catalog: Deployment Jobs | ✓ | ✓ | ✓ | ✓ |
Multiple Git Credentials | ✓ | ✓ | ✓ | ✓ |
Network Accept Logs | ✓ | ✓ | ||
New compute policy form | ✓ | ✓ | ✓ | ✓ |
Outbound (serverless) GCP Private Link | ||||
Power BI task type | ✓ | ✓ | ✓ | ✓ |
Predictive Optimization | ✓ | ✓ | ||
Production Monitoring for MLflow | ✓ | ✓ | ||
Remote query table-valued function | ✓ | ✓ | ✓ | ✓ |
Sample Data Exploration with Assistant | ✓ | ✓ | ✓ | ✓ |
Scala and Java UDFs in Unity Catalog | ✓ | ✓ | ✓ | ✓ |
Scoped personal access tokens | ✓ | ✓ | ✓ | ✓ |
Serverless Forecast Python SDK | ✓ | ✓ | ✓ | ✓ |
Serverless JARs | ✓ | ✓ | ||
Serverless Jobs/Workflows/Notebooks | ✓ | ✓ | ||
Serverless Lakeflow Pipelines | ✓ | ✓ | ||
Serverless Private Git | ✓ | ✓ | ||
Serverless SQL warehouses | ✓ | ✓ | ||
Serverless Workspace | ✓ | ✓ | ||
Sharing To Iceberg Clients | ✓ | ✓ | ✓ | ✓ |
Transactions | ✓ | ✓ | ✓ | ✓ |
Unified Runs List | ✓ | ✓ | ✓ | ✓ |
Unity Catalog Disaster Recovery | ||||
Unity Catalog Secrets | ✓ | ✓ | ✓ | ✓ |
Vector Search (Standard) | ✓ | ✓ | ||
Vector Search (Storage Optimized) | ||||
Vector Search Reranker | ||||
Workspace base environments | ✓ | ✓ |