# Canada Protected B
This page describes Canada Protected B compliance controls in Databricks.
## Canada Protected B overview
CCCS Medium (Protected B) compliance refers to adherence to the Canadian Centre for Cyber Security’s requirements for handling and protecting sensitive government information classified as "Protected B." This standard outlines controls for safeguarding data against unauthorized access, ensuring confidentiality, integrity, and availability for medium-impact information.
### Key points
- Designed for Canadian government workloads with medium sensitivity.
- Focuses on protecting data from compromise, loss, or unauthorized disclosure.
- Requires specific technical and organizational controls.
## Enable Canada Protected B compliance controls
To process data regulated by the Canada Protected B standard, your workspace must have the compliance security profile enabled.
Serverless compute base environment version 5 or higher will soon be required for Canada Protected B workloads on AWS. Databricks recommends upgrading to base environment version 5 now. To select a base environment for notebooks, see Select a base environment. To configure the environment for jobs, see Configure environment for job tasks.
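As an illustrative sketch only, a job's serverless base environment version can be pinned in the job's environment spec. The exact field names below (`environment_key`, `spec`, `environment_version`) are assumptions based on the serverless environment spec in the Jobs API; verify them against the Jobs API reference for your workspace:

```yaml
# Hypothetical Databricks Asset Bundle fragment pinning a job task's
# serverless base environment to version 5. Field names are assumptions;
# confirm against the Jobs API environment spec before use.
resources:
  jobs:
    example_job:
      environments:
        - environment_key: default
          spec:
            environment_version: "5"
      tasks:
        - task_key: example_task
          environment_key: default
          notebook_task:
            notebook_path: /Workspace/path/to/notebook
```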
Only specific preview features are supported for processing regulated data. For details on the compliance security profile, supported preview features, and supported regions, see Compliance security profile.
You are solely responsible for verifying that sensitive information is never entered in customer-defined input fields, such as workspace names, compute resource names, tags, job names, job run names, network names, credential names, storage account names, and Git repository IDs or URLs. These fields might be stored, processed, or accessed outside the compliance boundary.
To enable Canada Protected B compliance controls, see Configure enhanced security and compliance settings.
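For workspaces managed as code, the setting can also be applied declaratively. The following is a minimal sketch using the Databricks Terraform provider; the resource name and the `CANADA_PROTECTED_B` enum value are assumptions to confirm against the provider documentation:

```hcl
# Hypothetical sketch: enable the compliance security profile with the
# Canada Protected B standard via the Databricks Terraform provider.
# Resource and enum names are assumptions; verify in the provider docs.
resource "databricks_compliance_security_profile_workspace_setting" "this" {
  compliance_security_profile_workspace {
    is_enabled           = true
    compliance_standards = ["CANADA_PROTECTED_B"]
  }
}
```

Note that enabling the compliance security profile is a one-way operation on a workspace, so apply it deliberately.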
## Regional support for features
This table shows feature availability for Canada Protected B across all supported Databricks regions; a checkmark (✓) indicates that the feature is available. Some features may be listed as available shortly before they are released.
Feature | Available |
|---|---|
AI Functions - Classification | |
AI Functions - Document Parsing | ✓ |
AI Functions - Information Extraction | |
Anomaly Detection | |
Classic Compute | ✓ |
Clean Rooms | |
Data Classification | |
Databricks Apps | ✓ |
Databricks One | ✓ |
Default Storage | ✓ |
Genie Agent Mode | |
Genie Code | ✓ |
Genie Code Agent Mode | |
Genie Code Dashboard Agent | |
Genie Spaces | ✓ |
Knowledge Assistant | |
Lakebase Autoscaling | |
Lakeflow Connect - Confluence | |
Lakeflow Connect - Dynamics 365 | ✓ |
Lakeflow Connect - GA4 | |
Lakeflow Connect - Google Ads | |
Lakeflow Connect - HubSpot | |
Lakeflow Connect - Meta Ads | |
Lakeflow Connect - MySQL | ✓ |
Lakeflow Connect - NetSuite | |
Lakeflow Connect - PostgreSQL | ✓ |
Lakeflow Connect - SFTP | |
Lakeflow Connect - Salesforce | |
Lakeflow Connect - ServiceNow | |
Lakeflow Connect - SharePoint | ✓ |
Lakeflow Connect - TikTok Ads | |
Lakeflow Connect - Workday HCM | |
Lakeflow Connect - Workday Reports (RaaS) | |
Lakeflow Connect - Zendesk Support | |
Lakeflow Connect - Zerobus Ingest | |
Lakeflow Jobs | ✓ |
Lakeflow Pipelines Editor | |
Lakehouse Monitoring | |
MLflow on Databricks | ✓ |
Managed MCP Servers | ✓ |
Model Serving - AI Gateway | ✓ |
Model Serving - AI Guardrail | ✓ |
Model Serving - AI Playground | ✓ |
Model Serving - Custom Models | ✓ |
Model Serving - External Models | ✓ |
Model Serving - Foundation Models AI Function (ai_query) | ✓ |
Model Serving - Foundation Models Pay-Per-Token | ✓ |
Predictive Optimization | |
Serverless Jobs/Workflows/Notebooks | |
Serverless Lakeflow Pipelines | |
Serverless SQL warehouses | |
Serverless Workspace | |
Supervisor Agent | |
Vector Search (Standard) | |
Vector Search (Storage Optimized) | |