Workday Reports connector limitations

This article lists limitations and considerations for ingesting Workday reports using Databricks Lakeflow Connect.

General SaaS connector limitations

The limitations in this section apply to all SaaS connectors in Lakeflow Connect.

  • When you run a scheduled pipeline, alerts don't trigger immediately. Instead, they trigger when the next update runs.
  • When a source table is deleted, the destination table is not automatically deleted; you must delete it manually. This differs from the behavior of DLT pipelines.
  • During source maintenance periods, Databricks might not be able to access your data.
  • If a source table name conflicts with an existing destination table name, the pipeline update fails.
  • Multi-destination pipeline support is API-only.
  • You can optionally rename a table that you ingest. If you rename a table in your pipeline, the pipeline becomes API-only, and you can no longer edit it in the UI.
  • Column-level selection and deselection are API-only.
  • If you select a column after a pipeline has already started, the connector does not automatically backfill data for the new column. To ingest historical data, manually run a full refresh on the table. For a sketch of these API-only operations, see the example after this list.
  • Managed ingestion pipelines aren't supported for the following:
    • Workspaces in AWS GovCloud regions
    • Workspaces in Azure Government regions
    • FedRAMP-compliant workspaces
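
Several of the limitations above refer to API-only operations. The following is a minimal sketch, in Python against the Databricks Pipelines REST API, of renaming a destination table, selecting columns, and backfilling a newly selected column with a per-table full refresh. The workspace URL, token, connection name, and table names are placeholders, and the exact `ingestion_definition` and `table_configuration` field names should be verified against the Lakeflow Connect API reference for your connector.

```python
import requests

# Placeholders: substitute your workspace URL and an access token.
WORKSPACE = "https://<your-workspace>.cloud.databricks.com"
HEADERS = {"Authorization": "Bearer <token>"}

# Sketch of an ingestion pipeline spec. The ingestion_definition and
# table_configuration shapes follow the Lakeflow Connect managed
# ingestion API; verify field names against the Pipelines API
# reference for your connector and release.
pipeline_spec = {
    "name": "my-ingestion-pipeline",  # hypothetical pipeline name
    "ingestion_definition": {
        "connection_name": "my_connection",  # hypothetical UC connection
        "objects": [
            {
                "table": {
                    "source_schema": "src_schema",
                    "source_table": "src_table",
                    "destination_catalog": "main",
                    "destination_schema": "ingest",
                    # Renaming the destination table makes the pipeline
                    # API-only; it can no longer be edited in the UI.
                    "destination_table": "renamed_table",
                    "table_configuration": {
                        # Column-level selection is API-only.
                        "include_columns": ["id", "name", "updated_at"]
                    },
                }
            }
        ],
    },
}

resp = requests.post(f"{WORKSPACE}/api/2.0/pipelines",
                     headers=HEADERS, json=pipeline_spec)
resp.raise_for_status()
pipeline_id = resp.json()["pipeline_id"]

# A column selected after the pipeline has started is not backfilled
# automatically; trigger a full refresh of just that table instead.
requests.post(
    f"{WORKSPACE}/api/2.0/pipelines/{pipeline_id}/updates",
    headers=HEADERS,
    json={"full_refresh_selection": ["renamed_table"]},  # name format may vary
).raise_for_status()
```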

Connector-specific limitations

The limitations in this section are specific to the Workday Reports connector.

Authentication

  • Databricks recommends using a Workday integration system user (ISU), but this is not required.
  • Typically, a refresh token is created on behalf of an ISU. You can choose whether to allow the refresh token to expire:
    • If you set an expiration date, you must edit the connection when you reach that date.
    • If you don't set an expiration date, the refresh token can only expire if your organization reduces the access level of the ISU that's associated with the token.

Pipelines

  • The connector can only ingest reports with less than 2 GB of data or fewer than 1 million records. Your organization's Workday API limits might be lower than these thresholds.
  • Full refresh is supported, but incremental refresh is not, so the connector ingests the full report every time the pipeline runs.
  • You can only create a Workday Reports ingestion pipeline using Databricks APIs; you can't create the pipeline in the Databricks UI. For a sketch, see the example after this list.
  • The connector can't ingest reports with duplicate primary keys.
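
Because the connector is API-only, you create the pipeline with the Databricks Pipelines REST API. The sketch below assumes the Lakeflow Connect ingestion definition for Workday reports, where each ingested object is a report identified by its report-as-a-service (RaaS) URL; all names, URLs, and field shapes here are placeholders to verify against the current API reference.

```python
import requests

WORKSPACE = "https://<your-workspace>.cloud.databricks.com"  # placeholder
HEADERS = {"Authorization": "Bearer <token>"}                # placeholder token

# Sketch of a Workday Reports ingestion pipeline spec. Each ingested
# object is a report identified by its RaaS URL; verify the "report"
# object's field names against the current Lakeflow Connect reference.
pipeline_spec = {
    "name": "workday-reports-pipeline",              # hypothetical name
    "ingestion_definition": {
        "connection_name": "my_workday_connection",  # hypothetical UC connection
        "objects": [
            {
                "report": {
                    # Placeholder RaaS URL of the custom report to ingest.
                    "source_url": "https://<tenant>.workday.com/ccx/service/customreport2/<tenant>/<owner>/<report>",
                    "destination_catalog": "main",
                    "destination_schema": "workday",
                    "destination_table": "my_report",
                }
            }
        ],
    },
}

resp = requests.post(f"{WORKSPACE}/api/2.0/pipelines",
                     headers=HEADERS, json=pipeline_spec)
resp.raise_for_status()
print("Created pipeline:", resp.json()["pipeline_id"])
```

No incremental configuration is needed: because only full refresh is supported, each scheduled update re-ingests the entire report.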