dbt platform task for jobs
This feature is in Beta.
Use the dbt platform task to orchestrate and monitor existing dbt platform jobs directly from Databricks. This page explains how to select and trigger dbt jobs, set auto-retry options for failures, and monitor runs.
Differences between dbt platform and dbt tasks
Jobs offers two task types for dbt projects. Choose the right one based on where your dbt project is managed:
dbt platform task: Use this to orchestrate pre-existing dbt platform jobs. It connects to the dbt platform API and triggers a run there. Choose this if you want to centralize orchestration in Databricks while retaining all dbt platform benefits, such as monitoring and scheduling.
dbt task: Use this to run dbt Core projects on a Databricks cluster with code from Git. Choose this if you need full control over the execution environment and prefer to manage dependencies entirely within Databricks. See dbt task for jobs.
Prerequisites
To use the dbt platform task, you must meet the following prerequisites:
- A workspace admin must enable the preview. See Manage Databricks previews.
- You must have `CREATE CONNECTION` privileges on the Unity Catalog metastore in your workspace (see the grant sketch below).
- Access to an existing dbt project with a defined job in the dbt platform. To learn more, see Jobs in the dbt platform in the dbt documentation.
- Permissions to generate a service token in the dbt platform. To learn more, see Service account tokens.
For security and operational stability, Databricks recommends generating a service account token, not a personal access token. Service account tokens aren't tied to an individual user and can be easily scoped to provide the minimum necessary permissions.
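If you don't yet have the `CREATE CONNECTION` privilege, a metastore admin can grant it to you. A minimal sketch, run from a notebook by an admin; the user email is a placeholder:

```python
# Sketch: grant the Unity Catalog privilege this feature requires.
# Run as a metastore admin; replace the email with the real principal.
spark.sql("GRANT CREATE CONNECTION ON METASTORE TO `user@example.com`")
```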
Gather dbt platform details
To integrate dbt with Databricks, you need the following three details:
- Your dbt platform Account ID.
- An API key generated in the dbt platform.
- Your dbt platform deployment host URL.
The following sections describe how to find this required information.
Get your account ID
To retrieve your account ID:
- Log in to the dbt platform.
- Navigate to Settings > Account Settings.
- Get the Account ID from the URL suffix, which is in the following format: `https://cloud.getdbt.com/settings/accounts/{account_id}`.
Get your API key
To retrieve your API key:
- Log in to the dbt platform.
- Navigate to Settings > Profile Settings > Your Profile > Access API > API Key.
Get your host URL
Your host URL depends on your location and tenancy. See Access, Regions, & IP addresses in the dbt documentation to find the URL for your region.
Identify your region and tenancy (Multi-tenant or Cell-based). Use the Access URL column to get your host URL.
| Tenancy Type | Region Example | Host URL Example |
|---|---|---|
| Multi-tenant | North America | cloud.getdbt.com |
| Cell-based | North America (us1) | ACCOUNT_PREFIX.us1.dbt.com |
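Before creating the connection, you can optionally sanity-check all three details with one call to the dbt Administrative API. A minimal sketch, assuming the `requests` package and placeholder values for the account ID and token:

```python
# Sketch: confirm the host URL, account ID, and token work together by
# fetching the account from the dbt Administrative API (v2).
import requests

DBT_HOST = "https://cloud.getdbt.com"  # your host URL, no trailing slash
ACCOUNT_ID = "12345"                   # placeholder account ID
API_TOKEN = "dbtc_..."                 # placeholder service account token

resp = requests.get(
    f"{DBT_HOST}/api/v2/accounts/{ACCOUNT_ID}/",
    headers={"Authorization": f"Token {API_TOKEN}"},
    timeout=30,
)
resp.raise_for_status()  # a 200 response means all three details are valid
print(resp.json()["data"]["name"])
```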
dbt platform connection setup
Use the following steps to set up your dbt platform connection in Databricks.
- Click Catalog in the sidebar.
- Click the plus icon in the schema browser. Then, click Create a connection. The Set up connection form opens.
- Enter the following information, then click Next:
  - In Connection name, enter a name.
  - For Connection type, choose dbt platform.
- Enter your dbt platform host URL in the Host text field. Do not include a trailing slash (/).
- Enter your dbt platform Account ID and the API Token you collected in a previous step.
- Click Create connection to confirm the connection details.
- (Optional) Grant other users privileges to use the connection (a SQL alternative is sketched after these steps):
  - Choose the user IDs and groups you want to grant privileges to in the Principals drop-down menu.
  - Select the privileges you want to grant.
  - Click Confirm.
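As an alternative to the optional grant step above, you can grant access with Unity Catalog SQL from a notebook. A sketch assuming a connection named `my_dbt_connection` and a placeholder group:

```python
# Sketch: grant the USE CONNECTION privilege on the new connection.
# The connection and group names are placeholders.
spark.sql(
    "GRANT USE CONNECTION ON CONNECTION my_dbt_connection TO `analytics-team`"
)
```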
Create a new job with a dbt platform task
- In your workspace, click Jobs & Pipelines in the sidebar.
- Click Create, then Job. The new job is automatically named with an associated timestamp.
- (Optional) Click the job name and enter a new name to edit it. The Tasks tab displays with the empty task pane.
- Select the dbt platform task type:
  - If the Lakeflow Jobs UI is ON, click Add another task type. Search for dbt platform and click the tile to select it.
  - If the Lakeflow Jobs UI is OFF, use the Type drop-down menu to select dbt platform.
- Enter a Task name.
- Use the dbt platform connection drop-down menu to select the connection created previously.
- Use the dbt platform job drop-down menu to select the dbt platform job that you want to orchestrate.
- Click Save task.
- (Optional) Click Run now to manually test your job. You can also trigger the job through the Jobs API, as sketched below.
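As an alternative to clicking Run now, the saved job can be triggered programmatically through the Databricks Jobs API. A minimal sketch, assuming `requests`, a workspace personal access token, and a placeholder job ID (visible in the job's details panel):

```python
# Sketch: trigger a one-off run of the job with the Jobs API run-now endpoint.
import requests

HOST = "https://<your-workspace>.cloud.databricks.com"  # placeholder
TOKEN = "dapi..."   # placeholder Databricks personal access token
JOB_ID = 123456789  # placeholder job ID

resp = requests.post(
    f"{HOST}/api/2.1/jobs/run-now",
    headers={"Authorization": f"Bearer {TOKEN}"},
    json={"job_id": JOB_ID},
    timeout=30,
)
resp.raise_for_status()
print("Started run:", resp.json()["run_id"])
```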
Set a schedule or trigger
You can configure jobs to automatically trigger according to a time-based schedule or the arrival of new data. To learn more about the available options, see Automating jobs with schedules and triggers.
Continuous triggers are not supported for dbt platform jobs.
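A time-based schedule configured in the UI can also be expressed through the Jobs API. A sketch with the same placeholder host, token, and job ID as the previous example; the cron expression runs the job daily at 06:00 UTC:

```python
# Sketch: attach a daily 06:00 UTC schedule to the job via the Jobs API.
import requests

HOST = "https://<your-workspace>.cloud.databricks.com"  # placeholder
TOKEN = "dapi..."   # placeholder personal access token
JOB_ID = 123456789  # placeholder job ID

resp = requests.post(
    f"{HOST}/api/2.1/jobs/update",
    headers={"Authorization": f"Bearer {TOKEN}"},
    json={
        "job_id": JOB_ID,
        "new_settings": {
            "schedule": {
                "quartz_cron_expression": "0 0 6 * * ?",
                "timezone_id": "UTC",
                "pause_status": "UNPAUSED",
            }
        },
    },
    timeout=30,
)
resp.raise_for_status()
```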
Monitor runs
You can monitor Lakeflow jobs in the Databricks UI. For dbt platform jobs, you can also open a link that points to job run details in the dbt platform.
To monitor a run:
- Click Jobs & Pipelines in the workspace sidebar.
- (Optional) Select the Jobs and Owned by me filters.
- Click your job's Name link. The Runs tab appears, showing matrix and list views of active and completed runs.
- Click the link for the run in the Start time column in the runs list view. The dbt platform job status opens.
- Click View in dbt to see the job run details in the dbt platform.
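As a complement to the UI steps above, you can poll run status programmatically with the Jobs API. A sketch with the same placeholder credentials used earlier:

```python
# Sketch: list the five most recent runs of the job and print their states.
import requests

HOST = "https://<your-workspace>.cloud.databricks.com"  # placeholder
TOKEN = "dapi..."   # placeholder personal access token
JOB_ID = 123456789  # placeholder job ID

resp = requests.get(
    f"{HOST}/api/2.1/jobs/runs/list",
    headers={"Authorization": f"Bearer {TOKEN}"},
    params={"job_id": JOB_ID, "limit": 5},
    timeout=30,
)
resp.raise_for_status()
for run in resp.json().get("runs", []):
    state = run["state"]
    print(run["run_id"], state["life_cycle_state"], state.get("result_state", ""))
```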