
Name a destination table

Applies to: UI-based pipeline authoring, API-based pipeline authoring, SaaS connectors, Database connectors

By default, a destination table created during Lakeflow Connect managed ingestion takes the name of the corresponding source table. However, you can optionally specify a different name for the destination table. For example, if you ingest the same source object into two tables in the same schema, you must specify a unique name for at least one of the tables to differentiate between them, because managed ingestion connectors don't support duplicate destination table names in the same schema.

Name a destination table in the UI

You can name a destination table when you create or edit your managed ingestion pipeline in the Databricks UI.

On the Source page of the data ingestion wizard, enter a name in the Destination table field.

[Image: Optional destination table name setting in the Databricks UI]

Name a destination table using the API

You can name a destination table when you create or edit your managed ingestion pipeline using Databricks Asset Bundles, notebooks, or the Databricks CLI. To do this, set the destination_table parameter. For example:

Examples: Google Analytics

YAML
resources:
  pipelines:
    pipeline_ga4:
      name: <pipeline-name>
      catalog: <target-catalog> # Location of the pipeline event log
      schema: <target-schema> # Location of the pipeline event log
      ingestion_definition:
        connection_name: <connection>
        objects:
          - table:
              source_url: <project-id>
              source_schema: <property-name>
              destination_catalog: <target-catalog>
              destination_schema: <target-schema>
              destination_table: <custom-target-table-name> # Specify destination table name

Examples: Salesforce

YAML
resources:
  pipelines:
    pipeline_sfdc:
      name: <pipeline-name>
      catalog: <target-catalog> # Location of the pipeline event log
      schema: <target-schema> # Location of the pipeline event log
      ingestion_definition:
        connection_name: <connection>
        objects:
          - table:
              source_schema: <source-schema>
              source_table: <source-table>
              destination_catalog: <target-catalog>
              destination_schema: <target-schema>
              destination_table: <custom-target-table-name> # Specify destination table name
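
To illustrate the uniqueness requirement described at the top of this page, the following is a minimal sketch, reusing the Salesforce placeholders above, that ingests the same source table into two destination tables in the same schema. The pipeline key and the numbered placeholder names are illustrative assumptions; the only requirement is that the two destination_table values differ.

YAML
resources:
  pipelines:
    pipeline_sfdc_two_copies:
      name: <pipeline-name>
      catalog: <target-catalog> # Location of the pipeline event log
      schema: <target-schema> # Location of the pipeline event log
      ingestion_definition:
        connection_name: <connection>
        objects:
          - table:
              source_schema: <source-schema>
              source_table: <source-table>
              destination_catalog: <target-catalog>
              destination_schema: <target-schema>
              destination_table: <custom-target-table-name-1> # First copy of the source table
          - table:
              source_schema: <source-schema>
              source_table: <source-table> # Same source table ingested a second time
              destination_catalog: <target-catalog>
              destination_schema: <target-schema>
              destination_table: <custom-target-table-name-2> # Must differ from the first destination name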

Examples: Workday

YAML
resources:
  pipelines:
    pipeline_workday:
      name: <pipeline>
      catalog: <target-catalog> # Location of the pipeline event log
      schema: <target-schema> # Location of the pipeline event log
      ingestion_definition:
        connection_name: <connection>
        objects:
          - report:
              source_url: <report-url>
              destination_catalog: <target-catalog>
              destination_schema: <target-schema>
              destination_table: <custom-target-table-name> # Specify destination table name