Infosec Registered Assessors Program (IRAP)

This page describes IRAP compliance controls in Databricks.

IRAP Overview

IRAP is an Australian government initiative that certifies cloud service providers for handling government data. It involves independent assessment against the Australian Government Information Security Manual (ISM) controls.

Key points

  • Widely adopted across the Australian and New Zealand public sectors.
  • Involves assessment by an accredited IRAP assessor.
  • Focuses on compliance with ISM security controls.

Enable IRAP compliance controls

To process data regulated by the IRAP standard in your workspace, you must enable the compliance security profile on that workspace.

important

Serverless compute base environment version 5 or higher will soon be required for IRAP workloads on AWS. Databricks recommends upgrading to base environment version 5 now. To select a base environment for notebooks, see Select a base environment. To configure the environment for jobs, see Configure environment for job tasks.
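As a hedged illustration, the base environment version for a serverless job can be pinned in the job's environment spec. The sketch below builds a Jobs API create payload in Python; the field names (`environment_key`, `spec.environment_version`) are assumptions based on the serverless environment documentation and should be verified against the Jobs API reference for your workspace.

```python
# Sketch: pin serverless base environment version 5 in a Jobs API create payload.
# Field names are assumptions; verify against the Databricks Jobs API reference.

def serverless_job_payload(job_name, notebook_path, env_version="5"):
    """Build a Jobs API create payload pinning the serverless environment version."""
    return {
        "name": job_name,
        "environments": [
            {
                "environment_key": "default",
                # Assumed field for the serverless base environment version.
                "spec": {"environment_version": env_version},
            }
        ],
        "tasks": [
            {
                "task_key": "main",
                "notebook_task": {"notebook_path": notebook_path},
                # Serverless tasks reference the environment by key.
                "environment_key": "default",
            }
        ],
    }

payload = serverless_job_payload("irap-job", "/Workspace/jobs/etl")
```

Submitting this payload to the job-creation endpoint (or expressing the same fields in an asset bundle) pins new runs to base environment version 5.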

Only specific preview features are supported for processing regulated data. For details on the compliance security profile, supported preview features, and supported regions, see Compliance security profile.

note

Graviton VM types do not enforce FIPS 140 encryption. You must ensure that FIPS-approved cryptography is used.

You are solely responsible for verifying that sensitive information is never entered in customer-defined input fields, such as workspace names, compute resource names, tags, job names, job run names, network names, credential names, storage account names, and Git repository IDs or URLs. These fields might be stored, processed, or accessed outside the compliance boundary.
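One way to reduce the risk above is to screen customer-defined field values before creating resources. The helper below is a hypothetical sketch, not a Databricks API: the patterns are illustrative examples only (an SSN-like shape, a TFN-like digit run, and obvious secret keywords) and should be replaced with your organisation's own data-classification rules.

```python
import re

# Hypothetical helper: flag customer-defined field values (workspace names,
# tags, job names, etc.) that look like they contain sensitive identifiers.
# The patterns below are illustrative only; extend them to match your
# organisation's data classification rules.
SENSITIVE_PATTERNS = [
    re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),            # SSN-like identifier
    re.compile(r"\b\d{8,9}\b"),                      # TFN-like digit run
    re.compile(r"(?i)\b(secret|password|token)\b"),  # obvious secret keywords
]

def is_safe_field_value(value: str) -> bool:
    """Return True if no illustrative sensitive pattern matches the value."""
    return not any(p.search(value) for p in SENSITIVE_PATTERNS)
```

For example, `is_safe_field_value("etl-daily-job")` passes, while a job name embedding a keyword like `password` is rejected. Such a check is advisory only; you remain responsible for keeping sensitive data out of these fields.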

To enable IRAP compliance controls, see Configure enhanced security and compliance settings.
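For automation, the compliance security profile setting can also be driven through the Enhanced Security and Compliance settings API. The sketch below only constructs the settings payload; the field names mirror the shape of the Databricks SDK's compliance security profile setting and are assumptions to verify against the current API reference before use.

```python
# Sketch: settings payload enabling the compliance security profile with the
# IRAP compliance standard. Field names are assumptions modelled on the
# Databricks SDK's ComplianceSecurityProfileSetting; verify against the
# Enhanced Security and Compliance settings API reference.

def csp_enable_payload(standards=("IRAP",)):
    """Build a settings payload that enables the compliance security profile."""
    return {
        "compliance_security_profile_workspace": {
            "is_enabled": True,
            "compliance_standards": list(standards),
        }
    }
```

Note that enabling the compliance security profile is permanent for a workspace, so test such automation in a non-production workspace first.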

Regional support for features

The following features are tracked for IRAP availability in the supported region, ap-southeast-2. Some features may be listed as available before they are actually released.

  • AI Functions - Classification
  • AI Functions - Document Parsing
  • AI Functions - Information Extraction
  • Anomaly Detection
  • Classic Compute
  • Clean Rooms
  • Data Classification
  • Databricks Apps
  • Databricks One
  • Default Storage
  • Genie Agent Mode
  • Genie Code
  • Genie Code Agent Mode
  • Genie Code Dashboard Agent
  • Genie Spaces
  • Knowledge Assistant
  • Lakebase Autoscaling
  • Lakeflow Connect - Confluence
  • Lakeflow Connect - Dynamics 365
  • Lakeflow Connect - GA4
  • Lakeflow Connect - Google Ads
  • Lakeflow Connect - HubSpot
  • Lakeflow Connect - Meta Ads
  • Lakeflow Connect - MySQL
  • Lakeflow Connect - NetSuite
  • Lakeflow Connect - PostgreSQL
  • Lakeflow Connect - SFTP
  • Lakeflow Connect - Salesforce
  • Lakeflow Connect - ServiceNow
  • Lakeflow Connect - SharePoint
  • Lakeflow Connect - TikTok Ads
  • Lakeflow Connect - Workday HCM
  • Lakeflow Connect - Workday Reports (RaaS)
  • Lakeflow Connect - Zendesk Support
  • Lakeflow Connect - Zerobus Ingest
  • Lakeflow Jobs
  • Lakeflow Pipelines Editor
  • Lakehouse Monitoring
  • MLflow on Databricks
  • Managed MCP Servers
  • Model Serving - AI Gateway
  • Model Serving - AI Guardrail
  • Model Serving - AI Playground
  • Model Serving - Custom Models
  • Model Serving - External Models
  • Model Serving - Foundation Models AI Function (ai_query)
  • Model Serving - Foundation Models Pay-Per-Token
  • Predictive Optimization
  • Serverless Jobs/Workflows/Notebooks
  • Serverless Lakeflow Pipelines
  • Serverless SQL warehouses
  • Serverless Workspace
  • Supervisor Agent
  • Vector Search (Standard)
  • Vector Search (Storage Optimized)