Configure Delta storage credentials
Note
Databricks no longer recommends passing storage credentials through DataFrame options as described in this article. To configure Delta storage credentials, see Where’s my data? or What is Unity Catalog?.
Databricks stores data for Delta Lake tables in cloud object storage. Configuring access to cloud object storage requires permissions within the cloud account that contains your storage account. See Interact with external data on Databricks.
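For context, credentials are more commonly configured once per cluster or session through the Hadoop configuration, which applies them to every read and write in that session. The following is a minimal Python sketch of that session-wide approach, assuming S3 access keys and an existing SparkSession named spark; the fs.s3a.* keys are standard Hadoop S3A options, and all placeholder values are illustrative:

# Minimal sketch: session-wide S3 credentials set on the Hadoop configuration.
# <access-key>, <secret-key>, and the s3a:// path are placeholders.
hadoop_conf = spark.sparkContext._jsc.hadoopConfiguration()
hadoop_conf.set("fs.s3a.access.key", "<access-key>")
hadoop_conf.set("fs.s3a.secret.key", "<secret-key>")

# Every subsequent read and write in this session uses the same credentials,
# which is limiting when different locations require different keys.
df = spark.read.format("delta").load("s3a://<bucket>/<delta-table-path>")

The per-DataFrame options described below avoid this limitation by scoping credentials to an individual read or write.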
Pass storage credentials as DataFrame options
Delta Lake supports specifying storage credentials as options for DataFrameReader and DataFrameWriter. This is useful when a single job must read from or write to several storage accounts governed by different access keys.
Note
This feature is available in Databricks Runtime 10.1 and above.
For example, you can pass your storage credentials through DataFrame options:
Python

# Read from two locations that require different credentials.
df1 = (spark.read
  .format("delta")
  .option("fs.s3a.access.key", "<access-key-1>")
  .option("fs.s3a.secret.key", "<secret-key-1>")
  .load("...")
)
df2 = (spark.read
  .format("delta")
  .option("fs.s3a.access.key", "<access-key-2>")
  .option("fs.s3a.secret.key", "<secret-key-2>")
  .load("...")
)

# Write the combined result to a third location with its own credentials.
(df1.union(df2).write
  .format("delta")
  .mode("overwrite")
  .option("fs.s3a.access.key", "<access-key-3>")
  .option("fs.s3a.secret.key", "<secret-key-3>")
  .save("...")
)
Scala

// Read from two locations that require different credentials.
val df1 = spark.read
  .format("delta")
  .option("fs.s3a.access.key", "<access-key-1>")
  .option("fs.s3a.secret.key", "<secret-key-1>")
  .load("...")
val df2 = spark.read
  .format("delta")
  .option("fs.s3a.access.key", "<access-key-2>")
  .option("fs.s3a.secret.key", "<secret-key-2>")
  .load("...")

// Write the combined result to a third location with its own credentials.
df1.union(df2).write
  .format("delta")
  .mode("overwrite")
  .option("fs.s3a.access.key", "<access-key-3>")
  .option("fs.s3a.secret.key", "<secret-key-3>")
  .save("...")