Query federation for Redshift in Databricks SQL


This feature is Experimental and does not support Unity Catalog. Experimental features are provided as-is and are not supported by Databricks through customer technical support channels.

Databricks SQL supports read-only query federation to Redshift on serverless and pro SQL warehouses. For details on configuring Redshift S3 credentials, see Query Amazon Redshift with Databricks.

Connecting to Redshift with Databricks SQL

You configure connections to Redshift at the table level. You can use Databricks secrets to store and reference credentials without exposing them in plaintext. See the following example:

DROP TABLE IF EXISTS redshift_table;
CREATE TABLE redshift_table
USING redshift
OPTIONS (
  dbtable '<table-name>',
  tempdir 's3a://<bucket>/<directory-path>',
  url 'jdbc:redshift://<database-host-url>',
  user secret('redshift_creds', 'my_username'),
  password secret('redshift_creds', 'my_password'),
  forward_spark_s3_credentials 'true'
);
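
Once the table is defined, it can be queried read-only like any other table. The query below is a minimal sketch: it assumes the `redshift_table` defined above exists, that the `redshift_creds` secret scope has been populated (for example with the Databricks CLI secrets commands), and that the warehouse can reach both Redshift and the S3 tempdir.

-- Read-only federated query; the read is executed against Redshift
-- and intermediate data is staged in the configured S3 tempdir.
SELECT *
FROM redshift_table
LIMIT 10;

Because federation is read-only, INSERT, UPDATE, and DELETE statements against the table will fail; write to Redshift through its own tooling instead.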