
Triggering Airflow DAG from Another

Use an Operator to easily trigger downstream DAGs in Airflow


When building dependencies between DAGs, you don’t always have to rely on RESTful APIs.
Airflow also provides a built-in Operator for this purpose.

Introduction

Using the built-in TriggerDagRunOperator instead of calling the REST API comes with a few advantages:

  1. No need to pass an Authorization Token when triggering DAGs within the same Airflow environment (for contrast, see the REST sketch after this list).
  2. Built-in support for retry, timeout, dependency management, and error handling, with no network transmission issues to worry about.
  3. A clearer and more maintainable representation of upstream and downstream DAG relationships.
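
For contrast, here is a minimal sketch of the REST-based trigger this replaces, using Airflow 2's stable API. The base URL and basic-auth credentials are placeholder assumptions, not part of the original example:

# Hypothetical REST-based trigger, shown only for comparison
import requests

response = requests.post(
    "http://localhost:8080/api/v1/dags/dag_b/dagRuns",  # assumed local webserver
    auth=("airflow", "airflow"),  # assumed basic-auth credentials
    json={"conf": {"name": "John Doe", "age": 20}},
)
response.raise_for_status()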

This Operator is available in both Airflow 2 and the latest Airflow 3, though the import path differs slightly.
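
For reference, the two import paths look roughly like this (the Airflow 3 path assumes the operator's relocation into the standard provider package):

# Airflow 2
from airflow.operators.trigger_dagrun import TriggerDagRunOperator

# Airflow 3 (moved into the standard provider)
from airflow.providers.standard.operators.trigger_dagrun import TriggerDagRunOperator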

Example

The following example uses Airflow 2. Here, dag_a triggers dag_b.
To make it closer to how you would use a RESTful API, we’ll also demonstrate passing two variables between DAGs.

dag_a.py

First, import TriggerDagRunOperator and define the variables you want to pass through conf.
In this example, we’ll send name and age.

With the REST API, you would normally put these values under the conf key in the request body. Here, you simply specify the downstream DAG via trigger_dag_id, in this case dag_b.

Then, include the variables you want to send in the Operator’s conf argument:

# dag_a.py

import logging
from datetime import datetime

from airflow import DAG
from airflow.operators.trigger_dagrun import TriggerDagRunOperator

logger = logging.getLogger(__name__)


default_args = {
    "owner": "airflow",
    "depends_on_past": False,
    "start_date": datetime(2025, 7, 2),
}

with DAG(
    "dag_a",
    default_args=default_args,
    schedule_interval=None,
    dag_display_name="DAG A",
    tags=["test"],
) as dag:
    # Payload to forward to the downstream DAG
    conf = {
        "name": "John Doe",
        "age": 20,
    }

    # Trigger dag_b, passing the payload through conf
    call_dag_b = TriggerDagRunOperator(
        task_id="call_dag_b",
        trigger_dag_id="dag_b",
        conf=conf,
    )

dag_b.py

This downstream DAG receives the two parameters, name and age, and logs them:

# dag_b.py

import logging
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

logger = logging.getLogger(__name__)


default_args = {
    "owner": "airflow",
    "depends_on_past": False,
    "start_date": datetime(2025, 7, 2),
}

with DAG(
    "dag_b",
    default_args=default_args,
    schedule_interval=None,
    dag_display_name="DAG B",
    tags=["test"],
) as dag:

    def test_b(**kwargs):
        # conf sent by dag_a arrives merged into params
        # (controlled by core.dag_run_conf_overrides_params, True by default in Airflow 2)
        name = kwargs["params"].get("name")
        age = kwargs["params"].get("age")

        logger.info(f"name: {name}, age: {age}")
        return {"name": name, "age": age}

    # The task variable gets a distinct name so it doesn't shadow the callable
    test_b_task = PythonOperator(
        task_id="test_b",
        python_callable=test_b,
    )
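
As a side note, if you would rather not depend on conf being merged into params, the same payload is available on the dag_run object in the task context; a minimal sketch (test_b_via_dag_run is a hypothetical variant of the callable above):

# Alternative: read the payload straight from the triggering DagRun
def test_b_via_dag_run(**kwargs):
    conf = kwargs["dag_run"].conf or {}
    logger.info(f"name: {conf.get('name')}, age: {conf.get('age')}")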

From the UI you can see that dag_a uses TriggerDagRunOperator as its Operator:

(Screenshot: dag_a's graph in the Airflow UI)

And dag_b behaves as described, receiving the two parameters and logging them:

(Screenshot: dag_b's task log in the Airflow UI)

That’s all it takes to trigger one DAG from another using TriggerDagRunOperator.

This approach eliminates the need for REST API authentication and avoids the risks of network failures.
Since it’s an Operator, it also benefits from Airflow’s built-in retry, timeout, and dependency management features.
Finally, it keeps your DAG-to-DAG trigger logic explicit and easy to maintain.
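
To make the retry and timeout point concrete, here is a sketch of dag_a's trigger task with those knobs set. The values are illustrative assumptions, not recommendations:

from datetime import timedelta

call_dag_b = TriggerDagRunOperator(
    task_id="call_dag_b",
    trigger_dag_id="dag_b",
    conf={"name": "John Doe", "age": 20},
    retries=3,                                # retry the trigger task itself
    retry_delay=timedelta(minutes=1),
    execution_timeout=timedelta(minutes=10),  # fail the task if it runs too long
    wait_for_completion=True,                 # block until dag_b's run finishes
    poke_interval=30,                         # seconds between status checks
)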
