# Salesforce ingestion connector
Databricks Lakeflow Connect provides a built-in connector for ingesting data directly from the Salesforce Platform into Databricks. Data teams can easily build efficient, incremental pipelines at scale, and businesses can derive rich insights by unifying all their data and AI assets on the Data Intelligence Platform.
An organization might want to use Salesforce data to predict customer churn. The following video demonstrates how a retailer can do this by ingesting their customer order data, analyzing it, and then combining it with customer interactions across other channels for a holistic customer view.
## What to know before you start

- The workflow depends on your Databricks user persona.
- The steps to create a connection depend on the authentication method you choose. For supported methods, see Authentication methods.
- The steps to create a pipeline depend on the interface.
- The pipeline schedule depends on your latency and cost requirements.
- Depending on your ingestion needs, the pipeline might use configurations like history tracking, column selection, and row filtering. Supported configurations vary by connector. See Feature availability.
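To make these options concrete, the sketch below assembles an ingestion pipeline specification that combines a connection, a schedule, and per-table configuration. This is a minimal illustrative sketch: the field names (`source_table`, `include_columns`, `row_filter`, `scd_type`, `quartz_cron_expression`) are assumptions modeled loosely on an ingestion definition, not the authoritative schema — consult the Databricks reference for the real field names.

```python
# Illustrative sketch only: field names are assumptions, not the official
# Lakeflow Connect schema.

def build_ingestion_spec(connection_name, tables, cron_schedule=None):
    """Assemble a pipeline spec for a Salesforce ingestion pipeline.

    tables: list of dicts, each with a source object "name" and optional
    per-table configuration (column selection, row filter, history tracking).
    """
    objects = []
    for t in tables:
        obj = {"source_table": t["name"]}
        if "columns" in t:               # column selection: ingest a subset of fields
            obj["include_columns"] = t["columns"]
        if "row_filter" in t:            # row filtering: ingest matching rows only
            obj["row_filter"] = t["row_filter"]
        if t.get("history_tracking"):    # keep full change history (SCD type 2)
            obj["scd_type"] = "SCD_TYPE_2"
        objects.append(obj)

    spec = {"connection_name": connection_name, "objects": objects}
    if cron_schedule:                    # the schedule drives the latency/cost trade-off
        spec["schedule"] = {"quartz_cron_expression": cron_schedule}
    return spec

spec = build_ingestion_spec(
    "my_salesforce_connection",          # hypothetical connection name
    [{"name": "Order", "columns": ["Id", "Status"], "history_tracking": True}],
    cron_schedule="0 0 2 * * ?",         # nightly at 2:00 AM
)
```

A less frequent schedule (for example, nightly instead of hourly) lowers cost at the price of higher data latency, which is the trade-off the table above refers to.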
## Start ingesting from Salesforce
The following table provides an overview of the end-to-end Salesforce ingestion flow, based on user type:
| User | Steps |
|---|---|
| Admin | Either create a connection that other users can build pipelines on, or create both the connection and a pipeline yourself. |
| Non-admin | Use any supported interface to create a pipeline from an existing connection. See Ingest data from Salesforce. |
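The split in the table above can be sketched as two code paths. This is a hypothetical illustration, not the Databricks API: the registry and functions below are stand-ins showing that creating a connection is the admin-only step, while anyone can create a pipeline once a connection exists.

```python
# Hypothetical sketch of the admin vs. non-admin flow; these functions are
# illustrative stand-ins, not Databricks APIs.

connections = {}  # stands in for the connections available in a workspace


def create_connection(name, auth):
    """Admin-only: store Salesforce authentication details once."""
    connections[name] = {"auth": auth}
    return name


def create_pipeline(connection_name, objects):
    """Any user: build an ingestion pipeline on an existing connection."""
    if connection_name not in connections:
        raise LookupError("No such connection; ask an admin to create one first")
    return {"connection": connection_name, "objects": objects}


# Admin flow: create the connection (and optionally a pipeline too).
create_connection("sfdc_prod", auth="oauth")  # hypothetical names

# Non-admin flow: reuse the admin's connection to create a pipeline.
pipeline = create_pipeline("sfdc_prod", ["Account", "Order"])
```

The design point the sketch captures is that authentication details live on the connection, so they are configured once by an admin and reused by every pipeline built on top of it.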