
Default storage

Preview

This feature is in Private Preview and only available in serverless workspaces or accounts enrolled in the Private Preview. To request access to the preview, complete this form or contact your Databricks representative.

The Private Preview for default storage provides a fully managed storage location for Unity Catalog catalogs, which allows you to create managed tables and volumes without configuring storage credentials or external locations in Unity Catalog.

You can add a new catalog backed by default storage to an existing Unity Catalog-enabled workspace.

important

During the Private Preview, Databricks does not recommend storing production data or sensitive data in default storage or running any production workloads using default storage.

Enable default storage for your Databricks account

An account admin must enable default storage in the Databricks account console. This option is only available once Databricks has approved your request to join the Private Preview.

Have an account admin complete the following steps:

  1. Log in to the account console.

    note

    When using the provided link, users who are not account admins see a workspace selection screen rather than the account console.

  2. Click Preview.

  3. Set Default Storage to On.

important

The Default Storage option only appears if Databricks has enabled your account for this Private Preview. Default storage is enabled for all workspaces associated with your account, but additional requirements apply.

You must create a catalog backed by default storage before managed tables and managed volumes can store their data in default storage.

Requirements

  • Default storage is only supported for Databricks on AWS.
  • You must use serverless compute to access default storage.
  • Only the following regions support default storage:
    • us-east-1
    • us-east-2
    • us-west-2
    • eu-west-1
  • You must be enrolled in the Private Preview for default storage to use this functionality.
  • Your workspace must have Unity Catalog enabled.
note

If you are enabling Unity Catalog for the first time, Databricks recommends deploying a metastore without specifying an S3 bucket for managed storage. Ignore the optional steps in the Unity Catalog enablement documentation.

Create a catalog with default storage

During the Private Preview, you must use Catalog Explorer in the workspace UI to create a catalog with default storage. You cannot use the SQL command CREATE CATALOG catalog_name to create one.

You must have CREATE CATALOG privileges to create a catalog with default storage. See Unity Catalog privileges and securable objects.
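
A metastore admin can grant this privilege with a standard Unity Catalog GRANT statement. The following is a minimal sketch; the user name is a placeholder for illustration only:

    -- Run as a metastore admin; the user name below is a placeholder.
    GRANT CREATE CATALOG ON METASTORE TO `someone@example.com`;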

Complete the following steps to create a new catalog using default storage:

  1. Click Catalog in the sidebar. Catalog Explorer appears.
  2. Click Create catalog. The Create a new catalog dialog appears.
  3. Provide a Catalog name that is unique in your account.
  4. Select the option to Use default storage.
  5. Click Create.

Work with default storage

All interactions with default storage require serverless, Unity Catalog-enabled compute.

Resources backed by default storage use the same privilege model as other objects in Unity Catalog. You must have sufficient privileges to create, view, query, or modify data objects. See Unity Catalog privileges and securable objects.
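
For example, a catalog owner could grant a group read access with standard GRANT statements. The following sketch assumes a hypothetical catalog named sales_default and a group named analysts:

    -- Allow the analysts group to browse schemas and query tables in the catalog.
    -- Privileges granted on the catalog are inherited by its schemas and tables.
    GRANT USE CATALOG, USE SCHEMA, SELECT ON CATALOG sales_default TO `analysts`;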

By default, catalogs backed by default storage are visible in all workspaces in the same region. You can disable this behavior by binding the catalog to a single workspace. See Limit catalog access to specific workspaces.

You work with default storage by creating and interacting with managed tables and managed volumes backed by default storage. See Work with managed tables and What are Unity Catalog volumes?.
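
Once a catalog backed by default storage exists, creating managed objects in it looks the same as in any other Unity Catalog catalog. The following sketch assumes a hypothetical catalog named sales_default created through Catalog Explorer, run on serverless compute:

    -- Run on serverless compute; sales_default is a catalog backed by default storage.
    USE CATALOG sales_default;
    CREATE SCHEMA IF NOT EXISTS demo;

    -- Managed table: data files are written to default storage automatically.
    CREATE TABLE IF NOT EXISTS demo.orders (
      order_id BIGINT,
      amount   DECIMAL(10, 2),
      ts       TIMESTAMP
    );

    -- Managed volume: file storage backed by default storage, with no external location required.
    CREATE VOLUME IF NOT EXISTS demo.raw_files;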

To interact with data objects stored in default storage, you can use Catalog Explorer, notebooks, the query editor, and dashboards.

Examples of tasks you can complete with default storage include creating managed tables and volumes, loading and querying table data, and browsing files in managed volumes.
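
As a minimal sketch of such tasks, again assuming the hypothetical sales_default catalog and demo schema from the previous example:

    -- Insert and query data in a managed table backed by default storage.
    INSERT INTO sales_default.demo.orders VALUES (1, 19.99, current_timestamp());
    SELECT order_id, amount FROM sales_default.demo.orders;

    -- List files uploaded to a managed volume backed by default storage.
    LIST '/Volumes/sales_default/demo/raw_files';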

Limitations

During the Private Preview, Databricks does not recommend storing production data or sensitive data in default storage or running any production workloads using default storage. The following limitations apply:

  • Many regions are not supported. See Requirements.
  • Classic compute (any compute that is not serverless) cannot interact with data assets in default storage.
  • Delta Sharing supports sharing data assets within the same Databricks account, but not to other Databricks accounts or to open Delta Sharing clients.
  • External readers and writers cannot access default storage.
  • Default storage is not compatible with advanced networking configurations or customer-managed keys.
  • You cannot create a new catalog on default storage using the CREATE CATALOG catalog_name SQL command.