Troubleshoot common sharing issues in Delta Sharing

The following sections describe common errors that might occur when you try to access data in a share.

Resource limit exceeded errors

Issue: Your query on a shared table returns the error RESOURCE_LIMIT_EXCEEDED.

You may see either of these errors:

  • "RESOURCE_LIMIT_EXCEEDED","message":"The table metadata size exceeded limits"

  • "RESOURCE_LIMIT_EXCEEDED","message":"The number of files in the table to return exceeded limits, consider contact your provider to optimize the table"

Possible causes: Delta Sharing enforces limits on the size of table metadata and on the number of files that can be returned for a shared table.

Recommended fix: To learn how to resolve either of these issues, see RESOURCE_LIMIT_EXCEEDED error when querying a Delta Sharing table in the Databricks Knowledge Base.

AWS S3 bucket name issue

Issue: You see an error message reporting a file-not-found or certificate exception.

Spark error example:

FileReadException: Error while reading file delta-sharing:/%252Ftmp%252Fexample.share%2523example.tpc_ds.example/XXXXXXXXXXXXX/XXXXXXXX.

Caused by: SSLPeerUnverifiedException: Certificate for - <[workspace name]> doesn't match any of the subject alternative names [, *…]:

Power BI error example:

DataSource.Error: The underlying connection was closed: Could not establish trust relationship for the SSL/TLS secure channel.

Possible cause: Typically you see this error because your bucket name uses dot (period) notation (for example, a name like my.bucket.name; this name is illustrative). This is an AWS limitation. See the AWS bucket naming rules.

You may get this error even if your bucket name is formatted correctly. For example, you may encounter an SSL error (SSLCertVerificationError) when you execute code on PyCharm.
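The certificate failure arises because a TLS wildcard such as *.s3.amazonaws.com matches only a single DNS label, so a bucket name containing dots introduces extra labels that fall outside the certificate's subject alternative names. The following sketch (a simplified stand-in for the real RFC 6125 hostname check, with illustrative hostnames) shows the effect:

```python
def wildcard_matches(pattern: str, hostname: str) -> bool:
    # Simplified RFC 6125 rule: '*' matches exactly one DNS label.
    p_labels = pattern.split(".")
    h_labels = hostname.split(".")
    if len(p_labels) != len(h_labels):
        return False
    return all(p == "*" or p == h for p, h in zip(p_labels, h_labels))

# A bucket name without dots fits under the wildcard certificate.
print(wildcard_matches("*.s3.amazonaws.com", "my-bucket.s3.amazonaws.com"))

# A dotted bucket name adds extra labels, so validation fails.
print(wildcard_matches("*.s3.amazonaws.com", "my.bucket.s3.amazonaws.com"))
```

This is why renaming or replacing the bucket, rather than adjusting client settings, is the durable fix.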

Recommended fix: If your bucket name uses invalid AWS bucket naming notation, use a different bucket for Unity Catalog and Delta Sharing.

If your bucket uses valid naming conventions and you still face a FileNotFoundError in Python, enable debug logging to help isolate the issue:

import logging

# Send DEBUG-level records, including HTTP client request/response
# details, to the console to help isolate the failing request.
logging.basicConfig(level=logging.DEBUG)

Vacuumed data file issue

Issue: You see an error message reporting a "404 The specified [path|key] does not exist" exception.

Spark error examples:

java.lang.Throwable: HTTP request failed with status: HTTP/1.1 404 The specified path does not exist.


HTTP request failed with status: HTTP/1.1 404 Not Found <?xml version="1.0" encoding="UTF-8"?>
<Error><Code>NoSuchKey</Code><Message>The specified key does not exist.</Message>

Possible cause: Typically you see this error because the data file behind the pre-signed URL has been removed from the shared table by a VACUUM operation. This happens when the data file belongs only to a historical table version.

Workaround: Query the latest snapshot of the shared table instead of a historical version.
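As a sketch, assuming the open-source delta-sharing Python connector and placeholder profile-file and share/schema/table names, reading the latest snapshot (rather than pinning an old version) avoids referencing vacuumed files:

```python
import delta_sharing

# Placeholder profile file and share/schema/table coordinates.
url = "profile.share#example_share.example_schema.example_table"

# Omitting the version argument reads the latest snapshot, whose data
# files have not been removed by VACUUM on the provider side.
df = delta_sharing.load_as_pandas(url)

# Pinning a historical version (for example, version=3) may reference
# data files that VACUUM has already deleted, producing the 404 errors
# shown above.
```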