Troubleshoot common sharing issues in Delta Sharing
The following sections describe common errors that might occur when you try to access data in a share.
Resource limit exceeded errors
Issue: Your query on a shared table returns the error RESOURCE_LIMIT_EXCEEDED.
"RESOURCE_LIMIT_EXCEEDED","message":"The table metadata size exceeded limits"
or
"RESOURCE_LIMIT_EXCEEDED","message":"The number of files in the table to return exceeded limits, consider contact your provider to optimize the table"
Possible causes: There are limits on both the metadata size and the number of files allowed for a shared table. Queries that exceed either limit fail with this error.
Recommended fix: To learn how to resolve either of these issues, see RESOURCE_LIMIT_EXCEEDED error when querying a Delta Sharing table in the Databricks Knowledge Base.
Vacuumed data file issue
Issue: You see an error message that throws a “404 The specified [path|key] does not exist” exception.
Spark error examples:
java.lang.Throwable: HTTP request failed with status: HTTP/1.1 404 The specified path does not exist.
or
HTTP request failed with status: HTTP/1.1 404 Not Found <?xml version="1.0" encoding="UTF-8"?>
<Error><Code>NoSuchKey</Code><Message>The specified key does not exist.</Message>
Possible cause: Typically you see this error because the data file behind the pre-signed URL belongs to a historical table version and has been removed from the shared table by a VACUUM operation.
Workaround: Query the latest snapshot of the table instead of a historical version (that is, run the query without time travel options such as a version or timestamp).
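To apply the workaround automatically, a client can check whether a failure matches the vacuumed-file 404 patterns shown above and, if so, retry against the latest snapshot. This is a minimal sketch; the regular expression is derived only from the two Spark error examples in this section, so extend it if your storage provider returns a different 404 message.

```python
import re

# Patterns taken from the Spark error examples above: an Azure-style
# "specified path does not exist" 404 and an S3-style "404 Not Found"
# whose XML body reports NoSuchKey.
VACUUMED_404 = re.compile(
    r"404 (The specified (path|key) does not exist|Not Found)"
)

def is_vacuumed_file_error(message: str) -> bool:
    """Return True if the failure looks like a pre-signed URL pointing at
    a data file that VACUUM removed from the shared table."""
    return bool(VACUUMED_404.search(message))

# If this returns True, drop any time travel options (for example,
# versionAsOf or timestampAsOf in Spark) and re-run the query so it
# reads the latest snapshot of the shared table.
```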
Shared materialization asset access issue
Issue: Your query on a shared view, materialized view, or streaming table returns the error DS_MATERIALIZATION_QUERY_FAILED.
"DS_MATERIALIZATION_QUERY_FAILED": "The shared asset could not be materialized due to the asset not being accessible in the materialization workspace. Please ask data provider to contact Databricks support to override the materialization workspace."
Possible causes: The provider does not have read-write access to the asset they are trying to share, so it cannot be materialized in the materialization workspace.
Recommended fix: Contact your data provider to ensure they have read-write access to the shared data asset.
Network access error during data materialization
Issue: Your query on a shared data asset returns an error about accessing the data provider's cloud storage.
There was an issue accessing the data provider's cloud storage. Shared view materialization uses the Serverless compute of data provider's region to perform the materialization. Please contact the data provider to allowlist Serverless compute IPs of their corresponding region to access the view's dependent tables storage location.
Possible causes: The storage location for the materialized data has network restrictions (such as a firewall or private link) that prevent Databricks serverless compute from accessing it. When sharing views, materialized views, or streaming tables, the data is temporarily materialized on the provider's side. The materialization storage location is the asset's parent schema or catalog storage location.
Recommended fix: The data provider needs to allowlist the serverless compute IPs of their corresponding region so that serverless compute can access the storage location of the view's dependent tables. To configure your firewall, see Limit network egress for your workspace using a firewall.