Standard compute overview
This page provides an overview of standard compute.
What is standard compute?
Standard compute is compute configured with standard access mode. Standard compute resources can be used by any user who has been granted permission to do so.
Standard compute is recommended for most workloads. Standard compute allows any number of users to attach and concurrently execute workloads on the same compute resource, providing cost savings and simplified compute management. Standard compute runs user code in full isolation with no access to lower-level resources.
Access mode selection
Access mode is configured when you create an all-purpose or job compute resource. The access mode setting is under the Advanced section in the compute UI and is represented by data_security_mode in the API.
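For example, you can set the access mode when creating compute programmatically. The following is a minimal sketch using the Databricks SDK for Python; the cluster name, node type, and runtime version are placeholders for your workspace, and it assumes USER_ISOLATION is the data_security_mode value corresponding to standard access mode in your API version.

```python
from databricks.sdk import WorkspaceClient
from databricks.sdk.service.compute import DataSecurityMode

# Authenticates from the environment or ~/.databrickscfg.
w = WorkspaceClient()

# Create an all-purpose compute resource with standard access mode.
# USER_ISOLATION is assumed here to be the data_security_mode value for
# standard (shared) access mode; node type and runtime are placeholders.
cluster = w.clusters.create(
    cluster_name="shared-etl",
    spark_version="15.4.x-scala2.12",
    node_type_id="i3.xlarge",
    num_workers=2,
    autotermination_minutes=30,
    data_security_mode=DataSecurityMode.USER_ISOLATION,
).result()  # blocks until the compute is running

print(cluster.cluster_id, cluster.data_security_mode)
```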
By default in the UI, access mode is set to Auto, which means the access mode is chosen automatically based on your selected Databricks Runtime. Auto defaults to Standard unless you select a machine learning runtime or a Databricks Runtime version below 14.3, in which case Dedicated is used.
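To make the Auto rule concrete, here is an illustrative sketch of the selection logic described above. It is not Databricks' actual implementation, and the version parsing assumes a simple "major.minor" runtime version string.

```python
def auto_access_mode(runtime_version: str, is_ml_runtime: bool) -> str:
    """Illustrative only: mirrors the documented Auto rule, not
    Databricks' internal selection logic."""
    # Compare the major.minor portion of a version string such as "14.3".
    major, minor = (int(part) for part in runtime_version.split(".")[:2])
    if is_ml_runtime or (major, minor) < (14, 3):
        return "Dedicated"
    return "Standard"

assert auto_access_mode("15.4", is_ml_runtime=False) == "Standard"
assert auto_access_mode("14.3", is_ml_runtime=True) == "Dedicated"
assert auto_access_mode("13.3", is_ml_runtime=False) == "Dedicated"
```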
When to use standard compute
Standard compute is recommended for most workloads, including:
- General data engineering and ETL pipelines: Most data processing workloads
- Collaborative data science projects: Teams working together on analysis and model development
- Interactive data exploration: Ad-hoc analysis and notebook-based development
- Cost optimization: When you want to share compute resources across multiple users
Choose dedicated compute only for specialized workloads requiring privileged machine access, RDD APIs, distributed ML, GPUs, or R. For a list of standard compute limitations, see Standard compute requirements and limitations.
Language and runtime support
Standard compute has the following programming language support:
- Python: Full support for all Databricks Runtime versions
- SQL: Full support for all Databricks Runtime versions
- Scala: Supported on Databricks Runtime 13.3 LTS and above with Unity Catalog
- R: Not supported on standard compute
Lakeguard for user isolation
Standard compute uses Databricks Lakeguard to provide secure user isolation and data governance. Lakeguard uses code isolation techniques that separate user code from the underlying Spark infrastructure, which is what allows standard compute to run user code in full isolation with no access to lower-level resources.
For more information, see How does Databricks enforce user isolation?.