Source
```yaml
id: airbyte-cloud-dbt-cloud
namespace: company.team

tasks:
  - id: data_ingestion
    type: io.kestra.plugin.core.flow.Parallel
    tasks:
      - id: salesforce
        type: io.kestra.plugin.airbyte.cloud.jobs.Sync
        connectionId: e3b1ce92-547c-436f-b1e8-23b6936c12ab

      - id: google_analytics
        type: io.kestra.plugin.airbyte.cloud.jobs.Sync
        connectionId: e3b1ce92-547c-436f-b1e8-23b6936c12cd

      - id: facebook_ads
        type: io.kestra.plugin.airbyte.cloud.jobs.Sync
        connectionId: e3b1ce92-547c-436f-b1e8-23b6936c12ef

  - id: dbt_cloud_job
    type: io.kestra.plugin.dbt.cloud.TriggerRun
    jobId: "396284"
    accountId: "{{ secret('DBT_CLOUD_ACCOUNT_ID') }}"
    token: "{{ secret('DBT_CLOUD_API_TOKEN') }}"
    wait: true

pluginDefaults:
  - type: io.kestra.plugin.airbyte.cloud.jobs.Sync
    values:
      token: "{{ secret('AIRBYTE_CLOUD_API_TOKEN') }}"

description: |
  This Kestra flow orchestrates a modern ELT pipeline by combining **Airbyte Cloud** for data ingestion and **dbt Cloud** for transformations.
  It enables teams to automatically extract data from multiple SaaS sources, centralize it in the warehouse, and transform it into analytics-ready datasets.
  This flow is suitable for production-grade analytics pipelines, enabling **near real-time ingestion** from SaaS tools and **trusted transformations** via dbt Cloud.
  It can be scheduled, triggered via upstream dependencies, or launched on demand to support reporting and analytics use cases.
```
About this blueprint

Tags: dbt, Airbyte, Data Pipeline, ELT, SaaS
This Kestra flow orchestrates an end-to-end ELT pipeline using Airbyte Cloud and dbt Cloud:

- Runs three Airbyte Cloud syncs in parallel (Salesforce, Google Analytics, Facebook Ads) to ingest SaaS data.
- Executes a dbt Cloud job to transform the ingested data into analytics-ready tables.
- Leverages `pluginDefaults` for simplified, secure authentication with Airbyte Cloud.
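To illustrate what `pluginDefaults` saves, here is a sketch of how a single Sync task would look without it: the Airbyte token would have to be repeated on every Sync task instead of being declared once at the flow level (the connection ID shown is the one from the flow above).

```yaml
# Without pluginDefaults, each Sync task must carry its own token property:
- id: salesforce
  type: io.kestra.plugin.airbyte.cloud.jobs.Sync
  connectionId: e3b1ce92-547c-436f-b1e8-23b6936c12ab
  token: "{{ secret('AIRBYTE_CLOUD_API_TOKEN') }}"
```

Centralizing the token in `pluginDefaults` keeps the task definitions shorter and ensures the secret is referenced in exactly one place.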
Configuration:

- Add secrets for `AIRBYTE_CLOUD_API_TOKEN`, `DBT_CLOUD_ACCOUNT_ID`, and `DBT_CLOUD_API_TOKEN`.
- Replace the Airbyte `connectionId`s and the dbt `jobId` with those relevant to your environment.
- Schedule the flow or trigger it on demand to support analytics and reporting pipelines.
This automation ensures reliable ingestion and transformation of data across multiple SaaS sources, empowering teams with up-to-date, modeled data for decision-making.