Microsoft Dynamics 365 ingestion connector limitations

Preview

This feature is in Public Preview.

This page describes limitations and restrictions for the Microsoft Dynamics 365 connector in Lakeflow Connect.

General SaaS connector limitations

The Dynamics 365 connector shares limitations common to all SaaS connectors in Lakeflow Connect:

  • When you run a scheduled pipeline, alerts don't trigger immediately. Instead, they trigger when the next update runs.
  • When a source table is deleted, the destination table is not automatically deleted. You must delete the destination table manually. This differs from the behavior of Lakeflow Spark Declarative Pipelines.
  • During source maintenance periods, Databricks might not be able to access your data.
  • If a source table name conflicts with an existing destination table name, the pipeline update fails.
  • Multi-destination pipeline support is API-only.
  • You can optionally rename a table that you ingest. If you rename a table in your pipeline, it becomes an API-only pipeline, and you can no longer edit the pipeline in the UI.
  • Column-level selection and deselection are API-only.
  • If you select a column after a pipeline has already started, the connector does not automatically backfill data for the new column. To ingest historical data, manually run a full refresh on the table.
  • Databricks can't ingest two or more tables with the same name in the same pipeline, even if they come from different source schemas.
  • The connector assumes that cursor columns in the source system are monotonically increasing.
  • With SCD type 1 enabled, deletes don't produce an explicit delete event in the change data feed. For auditable deletions, use SCD type 2 if the connector supports it. For details, see Example: SCD type 1 and SCD type 2 processing with CDF source data.
  • The connector ingests raw data without transformations. Apply transformations downstream with Lakeflow Spark Declarative Pipelines, as sketched after this list.
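
The connector lands data as-is, so cleanup belongs in a downstream pipeline. The following is a minimal sketch using the Python dlt API, assuming the connector landed raw data in a hypothetical table main.d365_raw.account and that statecode = 0 marks active records:

```python
import dlt
from pyspark.sql import functions as F

@dlt.table(comment="Accounts with basic cleanup, downstream of raw ingestion")
def account_clean():
    # `spark` is provided by the pipeline runtime; the table name is hypothetical.
    return (
        spark.read.table("main.d365_raw.account")
        .filter(F.col("statecode") == 0)  # assumption: 0 = active record
        .withColumn("processed_at", F.current_timestamp())
    )
```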

The Dynamics 365 connector requires Azure Synapse Link for Dataverse to run continuously:

  • Continuous operation: Synapse Link must export data without interruption. If Synapse Link stops, the connector can't capture changes until it resumes.
  • Export latency: Changes appear in Azure Data Lake Storage (ADLS) Gen2 only after Synapse Link's export interval (typically 5-15 minutes). This latency is inherent to the Synapse Link architecture.
  • Retention policies: Configure appropriate retention policies for your ADLS Gen2 storage. If you delete Synapse Link exports before ingestion, you must perform a full refresh.
important

If Azure Synapse Link stops for an extended period, you might miss changes. Monitor Synapse Link health in the Power Apps maker portal and set up alerts for export failures.
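
As a rough health check outside the portal, you can verify that fresh export folders are still landing in ADLS Gen2. This is a sketch using the azure-storage-file-datalake package, assuming timestamped export folders at the container root; the account URL and container name are placeholders:

```python
from datetime import datetime, timedelta, timezone

from azure.identity import DefaultAzureCredential
from azure.storage.filedatalake import DataLakeServiceClient

ACCOUNT_URL = "https://<storage-account>.dfs.core.windows.net"  # placeholder
CONTAINER = "<synapse-link-container>"                          # placeholder

client = DataLakeServiceClient(ACCOUNT_URL, credential=DefaultAzureCredential())
fs = client.get_file_system_client(CONTAINER)

# Flag the export as stale if no top-level folder changed in the last hour.
cutoff = datetime.now(timezone.utc) - timedelta(hours=1)
recent = [
    p for p in fs.get_paths(recursive=False)
    if p.is_directory and p.last_modified > cutoff
]
if not recent:
    print("No export folder updated in the last hour; check Synapse Link health.")
```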

Schema evolution

At this time, the Dynamics 365 connector does not support automated schema evolution.

Virtual entity schema evolution

Finance & Operations (F&O) virtual entities require additional handling for schema changes:

  • Virtual entity updates: When F&O virtual entities are updated, you must refresh the virtual entity configuration in Dataverse.
  • Synchronization delay: Allow up to 15 minutes for virtual entity changes to appear in Dataverse schema discovery.
  • Full refresh required: Virtual entity schema changes require a full refresh of affected tables.

Action required: Monitor F&O virtual entity updates and coordinate full refreshes with your D365 administrator.
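
When a coordinated full refresh is needed, you can scope it to the affected tables from the Databricks side. A minimal sketch with the Databricks Python SDK; the pipeline ID and table name are placeholders:

```python
from databricks.sdk import WorkspaceClient

w = WorkspaceClient()

# Trigger an update that fully refreshes only the changed virtual entity.
w.pipelines.start_update(
    pipeline_id="<pipeline-id>",                 # placeholder
    full_refresh_selection=["mserp_custtable"],  # hypothetical F&O entity name
)
```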

Incremental ingestion

The Dynamics 365 connector's incremental ingestion has these limitations:

  • Requires VersionNumber: Synapse Link must export changelogs with the VersionNumber field. If VersionNumber is missing, you must use full refresh mode.
  • Folder-based processing: The connector processes Synapse Link export folders in chronological order. If folders are deleted or missing, the connector can't recover without a full refresh.
  • No backfill: If Synapse Link misses changes due to downtime, those changes aren't captured unless you perform a full refresh.
  • Delete detection: The connector detects deletes only if Synapse Link exports delete records in changelogs. Some D365 configurations don't export deletes.
tip

Verify your Synapse Link configuration exports changelogs with VersionNumber before creating pipelines. See Configure data source for Microsoft Dynamics 365 ingestion.
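
One way to verify this is to inspect the model.json manifest that Synapse Link writes at the container root. The following sketch assumes the standard Common Data Model manifest layout and a placeholder ABFSS path; run it in a notebook where spark is available:

```python
import json

# Placeholder path to the Synapse Link container's CDM manifest.
model_path = "abfss://<container>@<account>.dfs.core.windows.net/model.json"

model = json.loads(spark.read.text(model_path, wholetext=True).first()["value"])

# Check each exported entity for a VersionNumber attribute (case-insensitive).
for entity in model.get("entities", []):
    attrs = {a["name"].lower() for a in entity.get("attributes", [])}
    status = "OK" if "versionnumber" in attrs else "MISSING VersionNumber"
    print(f"{entity['name']}: {status}")
```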

Attachments and files

The Dynamics 365 connector ingests attachment metadata but not file contents:

  • Metadata only: The connector ingests attachment tables (for example, annotation, attachment) with file names, sizes, MIME types, and record associations.
  • No binary data: The connector doesn't ingest file contents. You must download files separately using the Dynamics 365 Web API or Power Automate.
  • Storage limitation: Synapse Link exports table data, not binary files stored in D365.

Workaround: Use the attachment metadata to identify the files you need, then download them with the D365 APIs and store them in a Unity Catalog volume or ADLS Gen2, as sketched below.
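
The following sketches that workaround for the annotation table, using the Dataverse Web API field documentbody, which returns file contents as Base64. The org URL, bearer token, annotation ID, and volume path are placeholders, and production code would need paging and throttling handling:

```python
import base64

import requests

ORG_URL = "https://<org>.crm.dynamics.com"               # placeholder
ANNOTATION_ID = "<guid-from-ingested-annotation-table>"  # placeholder
HEADERS = {"Authorization": "Bearer <token>"}            # placeholder

resp = requests.get(
    f"{ORG_URL}/api/data/v9.2/annotations({ANNOTATION_ID})",
    params={"$select": "filename,documentbody"},
    headers=HEADERS,
    timeout=30,
)
resp.raise_for_status()
record = resp.json()

# Decode the Base64 payload and store it in a Unity Catalog volume (placeholder path).
with open(f"/Volumes/main/default/attachments/{record['filename']}", "wb") as f:
    f.write(base64.b64decode(record["documentbody"]))
```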

Data type support

The Dynamics 365 connector supports most Dataverse data types but has limitations for complex types:

Supported with full fidelity

  • String (single-line and multi-line text)
  • Integer (whole number)
  • Decimal (decimal number)
  • Boolean (yes/no)
  • DateTime (date and time)
  • Money (currency)
  • Lookup (foreign key references, stored as GUIDs)

Supported with limitations

  • Option sets (picklists): The connector ingests these as integer values. To map integers to labels, join with the OptionSetMetadata table or maintain a reference table, as sketched after this list.
  • Multi-select option sets: The connector ingests these as comma-separated integer strings. Parse the string to extract individual values.
  • Lookup fields: The connector ingests these as GUIDs. To get related record data, join with the referenced table.
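
A PySpark sketch of these joins, assuming Synapse Link also exports the OptionSetMetadata table and using hypothetical table and column names:

```python
from pyspark.sql import functions as F

accounts = spark.read.table("main.d365_raw.account")              # hypothetical
option_sets = spark.read.table("main.d365_raw.OptionSetMetadata")

# Map option set integers to display labels.
industry_labels = (
    option_sets
    .filter((F.col("EntityName") == "account") & (F.col("OptionSetName") == "industrycode"))
    .select(F.col("Option").alias("industrycode"),
            F.col("LocalizedLabel").alias("industry_label"))
)
resolved = accounts.join(industry_labels, on="industrycode", how="left")

# Split a multi-select option set (comma-separated string) into an array of ints.
resolved = resolved.withColumn(
    "channels", F.split(F.col("multiselect_channels"), ",").cast("array<int>")
)

# Resolve a lookup GUID by joining the referenced table.
contacts = spark.read.table("main.d365_raw.contact")
resolved = resolved.join(
    contacts.select(
        F.col("contactid").alias("primarycontactid"),
        F.col("fullname").alias("primary_contact_name"),
    ),
    on="primarycontactid",
    how="left",
)
```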

Not supported

  • Complex JSON objects: Some custom Dataverse data types export as JSON. The connector ingests these as strings. Parse the JSON in downstream transformations, as sketched after this list.
  • Images: Image metadata is ingested, but image data must be downloaded separately.
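
For JSON exported as strings, from_json recovers the structure downstream. A sketch assuming a hypothetical custom column new_address_json with a known shape:

```python
from pyspark.sql import functions as F
from pyspark.sql.types import StringType, StructField, StructType

# Assumed shape of the JSON payload; adjust to your custom data type.
address_schema = StructType([
    StructField("street", StringType()),
    StructField("city", StringType()),
])

parsed = (
    spark.read.table("main.d365_raw.account")  # hypothetical raw table
    .withColumn("address", F.from_json(F.col("new_address_json"), address_schema))
    .select("accountid", "address.street", "address.city")
)
```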

See Microsoft Dynamics 365 connector reference for a complete data type mapping table.

Performance considerations

The Dynamics 365 connector's performance depends on several factors:

Initial sync time

  • Data volume: Tables with millions of records take longer to sync initially.
  • Synapse Link export: The initial Synapse Link export can take hours for large datasets.
  • Network throughput: Transfer speed between Azure and Databricks affects sync time.

Recommendation: Start with a small subset of tables to validate the setup, then add more tables incrementally.

Incremental sync time

  • Change volume: High-frequency changes (for example, thousands of updates per minute) increase processing time.
  • Folder count: Synapse Link creates folders at regular intervals. More folders increase processing overhead.
  • Changelog size: Large changelogs take longer to process.

Recommendation: Schedule pipeline runs based on your change volume. For high-frequency changes, run pipelines more often to keep changelogs small.
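
One way to run the pipeline on a frequent cadence is a scheduled job that triggers it. A minimal sketch with the Databricks Python SDK; the job name, pipeline ID, and cron expression are placeholders to adapt to your change volume:

```python
from databricks.sdk import WorkspaceClient
from databricks.sdk.service import jobs

w = WorkspaceClient()

# Create a job that triggers the ingestion pipeline every 15 minutes.
w.jobs.create(
    name="d365-ingest-every-15-min",
    tasks=[
        jobs.Task(
            task_key="run_pipeline",
            pipeline_task=jobs.PipelineTask(pipeline_id="<pipeline-id>"),
        )
    ],
    schedule=jobs.CronSchedule(
        quartz_cron_expression="0 0/15 * * * ?",  # Quartz syntax: every 15 minutes
        timezone_id="UTC",
    ),
)
```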

Pipeline limitations

The Dynamics 365 connector has these pipeline-specific limitations:

  • Maximum tables per pipeline: 250 tables. For large D365 environments, create multiple pipelines.
  • No UI authoring: At this time, you must create pipelines using the CLI, Databricks Asset Bundles, or notebooks. See the sketch after the following note.
note

The 250-table limit is per pipeline, not per connection. To ingest more than 250 tables, create multiple pipelines that use the same connection.
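
A minimal sketch of creating one such pipeline programmatically with the Databricks Python SDK. The connection, catalog, schema, and table names are placeholders; confirm the exact object specification for the Dynamics 365 connector in the Lakeflow Connect documentation:

```python
from databricks.sdk import WorkspaceClient
from databricks.sdk.service import pipelines

w = WorkspaceClient()

w.pipelines.create(
    name="d365-ingest-accounts",
    catalog="main",     # placeholder destination catalog
    target="d365_raw",  # placeholder destination schema
    ingestion_definition=pipelines.IngestionPipelineDefinition(
        connection_name="my_d365_connection",  # placeholder connection
        objects=[
            pipelines.IngestionConfig(
                table=pipelines.TableSpec(
                    source_schema="dataverse",  # placeholder
                    source_table="account",     # placeholder
                    destination_catalog="main",
                    destination_schema="d365_raw",
                )
            )
        ],
    ),
)
```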

API and SDK limitations

The Dynamics 365 connector uses Dataverse and Azure Storage APIs with these limitations:

  • API version compatibility: The connector is tested with Dataverse API v9.2 and later. Older versions might not be supported.
  • Azure Storage API: The connector uses Azure Storage REST API version 2021-08-06. Make sure that your storage account supports this version.
  • Synapse Link version: The connector requires Azure Synapse Link for Dataverse version 1.0 or later.

Recommendation: Keep your D365, Dataverse, and Azure services updated to the latest versions for best compatibility.

Known issues

  • Virtual entity synchronization delays: Virtual entities sometimes take longer than 15 minutes to synchronize. If tables don't appear, wait up to 30 minutes and retry.
  • Synapse Link export failures: Synapse Link occasionally fails to export specific tables. Check Synapse Link logs in Power Apps for errors.
  • Changelog VersionNumber gaps: In rare cases, changelogs might have gaps in VersionNumber sequences. This doesn't affect data integrity but might cause warnings in pipeline logs.

Report issues to Databricks support with pipeline IDs and timestamps for investigation.