Connect to external data sources
This article explains how to connect your SAP Databricks workspace to non-SAP data sources.
SAP Databricks supports reading data from cloud object storage (AWS S3 or Google Cloud Storage, depending on your deployment), external APIs, and Delta Sharing shares.
Connect to an S3 bucket
Admins in SAP Databricks accounts deployed on AWS can create external locations to connect their workspace to an AWS S3 bucket. For instructions on creating an external location, see Connect to an S3 bucket.
To prevent data loss, SAP Databricks requires that external locations be read-only.
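After an external location is configured, users with the appropriate privileges can read data from the governed S3 path in a notebook. The following PySpark sketch is illustrative only; the bucket name, path, and Delta format are assumptions, and the note about writes follows from the read-only requirement above.

```python
from pyspark.sql import SparkSession

# In Databricks notebooks a SparkSession is already available as `spark`;
# getOrCreate() simply returns it.
spark = SparkSession.builder.getOrCreate()

# Read Delta data stored under the external location's S3 path (placeholder path).
df = spark.read.format("delta").load("s3://example-bucket/sales/")
df.show(5)

# Writes to this path are expected to fail, because SAP Databricks external
# locations are read-only.
```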
Connect to a GCS bucket
Admins in SAP Databricks accounts deployed on GCP can create external locations to connect their workspace to a Google Cloud Storage (GCS) bucket. For instructions on creating an external location, see Connect to a Google Cloud Storage bucket.
To prevent data loss, SAP Databricks requires that external locations be read-only.
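As with S3, data under a configured GCS external location can be read directly from a notebook. The sketch below is a minimal example; the gs:// path and Parquet format are placeholders for your own data.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()  # already available as `spark` in Databricks notebooks

# Read Parquet files under the governed GCS path (placeholder path).
df = spark.read.parquet("gs://example-bucket/events/")
df.show(5)

# A write to the same location is expected to fail, since SAP Databricks
# requires external locations to be read-only:
# df.write.parquet("gs://example-bucket/events-copy/")
```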
Receive a Delta Share
SAP Databricks accounts can receive Delta Sharing shares from workspaces both inside and outside their account. To learn about receiving and reading shared data, see Access data shared with you using Delta Sharing.
SAP Databricks accounts can receive Delta Sharing shares, but they cannot create shares for recipients outside of SAP.
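After a provider's share has been made available in your workspace (typically by creating a catalog from the share, as described in the linked documentation), shared tables can be queried like any other Unity Catalog table. The sketch below assumes a hypothetical catalog named shared_catalog with a sales schema and an orders table; substitute the names used in your workspace.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()  # already available as `spark` in Databricks notebooks

# Read a shared table using the three-level Unity Catalog namespace
# (catalog.schema.table); all names here are placeholders.
shared_df = spark.read.table("shared_catalog.sales.orders")
shared_df.show(5)

# The same table can also be queried with SQL.
spark.sql("SELECT COUNT(*) AS order_count FROM shared_catalog.sales.orders").show()
```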