This repository contains example DAGs showing features released in Apache Airflow 2.7.
Aside from core Apache Airflow, this project uses:
- The Astro CLI (version 1.17.0) to run Airflow locally.
- The Airflow Apprise Provider.
- The Apprise package.
- The Airflow Amazon Provider.
- The Airflow HTTP Provider.
- The Airflow Common SQL Provider.
- The Airflow SQLite Provider.
For pinned versions of the provider packages, see the `requirements.txt` file.
This section explains how to run this repository with Airflow.
Note
For some DAGs you will need to define extra connections. You can either set these up yourself or create a `.env` file and copy the contents of `.env.example` into it. You will need to replace the values in `.env` with your own credentials.
See the Manage Connections in Apache Airflow guide for further instructions on Airflow connections.
DAGs with the tag `core` work without any additional connections or tools.
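Connections can, for example, be defined as environment variables in the `.env` file using Airflow's `AIRFLOW_CONN_<CONN_ID>` convention. The connection ID and values below are placeholders for illustration, not the exact contents of `.env.example`:

```text
# Hypothetical example: an AWS connection defined as an environment variable.
AIRFLOW_CONN_AWS_DEFAULT='{"conn_type": "aws", "login": "<your_aws_access_key_id>", "password": "<your_aws_secret_access_key>"}'
```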
Run this Airflow project without installing anything locally.
- Fork this repository.
- Create a new GitHub Codespaces project on your fork. Make sure it uses at least 4 cores!
- After creating the Codespaces project, the Astro CLI will automatically start up all necessary Airflow components. This can take a few minutes.
- Once the Airflow project has started, access the Airflow UI by clicking on the Ports tab and opening the forwarded URL for port 8080.
Download the Astro CLI to run Airflow locally in Docker. `astro` is the only package you will need to install.
- Run `git clone https://github.com/astronomer/2-7-example-dags.git` on your computer to create a local clone of this repository.
- Install the Astro CLI by following the steps in the Astro CLI documentation. Docker Desktop/Docker Engine is a prerequisite, but you don't need in-depth Docker knowledge to run Airflow with the Astro CLI.
- Run `astro dev start` in your cloned repository.
- After your Astro project has started, view the Airflow UI at `localhost:8080`.
The following sections list the DAGs sorted by the feature they showcase. You can filter DAGs in the UI by their tags.
DAGs that showcase the setup and teardown tasks added in Airflow 2.7. For further information see the setup/teardown guide. A minimal syntax sketch follows the DAG lists below.
Four use case DAGs:
- `setup_teardown_cleanup_xcom`: DAG that plays Texas Hold'em Poker. Shows how to use a teardown task to clean up XComs. Needs a custom XCom backend using S3 and a connection to AWS. The relevant environment variables are shown in `.env.example` and the custom XCom backend can be imported from `include/custom_xcom_backend/s3_xcom_backend.py`.
- `setup_teardown_complex_sqlite_decorators`: DAG that shows 3 nested setup/teardown workflows modifying data about Star Trek in a SQLite database. Has a data quality check using the SQLColumnCheckOperator, for which it needs a connection to the SQLite database (see `.env.example` or the DAG description). A hedged sketch of such a check follows this list.
- `setup_teardown_csv_decorators`: A setup/teardown pipeline that can be run locally and works on a CSV file. Uses decorators.
- `setup_teardown_csv_methods`: A setup/teardown pipeline that can be run locally and works on a CSV file. Uses methods.
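The data quality check in `setup_teardown_complex_sqlite_decorators` uses the `SQLColumnCheckOperator` from the Common SQL provider. The sketch below is a minimal, hypothetical illustration of such a check; the DAG ID, connection ID, table, and column names are placeholders, not values from this repository's DAG.

```python
# Minimal sketch of a column-level data quality check with the Common SQL provider.
# The connection ID, table, and column names below are placeholders.
from pendulum import datetime

from airflow import DAG
from airflow.providers.common.sql.operators.sql import SQLColumnCheckOperator

with DAG(
    dag_id="data_quality_check_sketch",
    start_date=datetime(2023, 8, 1),
    schedule=None,
    catchup=False,
):
    check_characters = SQLColumnCheckOperator(
        task_id="check_characters",
        conn_id="sqlite_default",  # assumption: a SQLite connection defined via .env
        table="star_trek_characters",  # hypothetical table name
        column_mapping={
            "name": {"null_check": {"equal_to": 0}},  # no NULL names allowed
            "rank": {"distinct_check": {"geq_to": 1}},  # at least one distinct rank value
        },
    )
```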
Three toy DAGs and one comparison DAG:
- `toy_setup_teardown_simple`: Shows a very simple setup/teardown workflow with empty `@task` tasks.
- `toy_setup_teardown_nesting`: Shows how to nest setup/teardown tasks with empty `@task` tasks.
- `toy_setup_teardown_task_group_and_failures`: DAG that shows the special behavior of teardown tasks in a task group. All tasks can be set to fail individually via params to explore the behavior of the DAG.
- `setup_teardown_csv_NO_setup_teardown`: This DAG exists as a comparison to the `setup_teardown_csv_methods` and `setup_teardown_csv_decorators` DAGs. It does not use setup/teardown tasks.
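As a hedged illustration of the syntax these DAGs use, the sketch below shows the `@setup` and `@teardown` decorators added in Airflow 2.7. The DAG ID and task logic are placeholders and do not come from any DAG in this repository.

```python
# Minimal sketch of setup/teardown tasks in Airflow 2.7; all task logic is a placeholder.
from pendulum import datetime

from airflow.decorators import dag, setup, task, teardown


@dag(start_date=datetime(2023, 8, 1), schedule=None, catchup=False, tags=["toy"])
def setup_teardown_sketch():
    @setup
    def create_resource():
        print("provisioning a temporary resource")

    @task
    def do_work():
        print("using the resource")

    @teardown
    def delete_resource():
        # runs even if do_work fails, so the resource is always cleaned up
        print("cleaning up the resource")

    create_resource() >> do_work() >> delete_resource()


setup_teardown_sketch()
```

The `setup_teardown_csv_methods` DAG achieves the same pairing with the `.as_teardown(setups=...)` method on a regular task instead of the decorators.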
DAGs that showcase the dependency functions `chain()`, `cross_downstream()`, and the dependency function added in Airflow 2.7: `chain_linear()`. A short sketch of `chain_linear()` follows the list below.
- `toy_chain_linear_vs_chain_simple`: Simple comparison between `chain()` and `chain_linear()`.
- `toy_chain_linear_vs_chain_complex`: Shows how `chain_linear()` allows dependencies between lists of different lengths.
- `toy_chain_linear_task_group`: Shows `chain_linear()` being used with tasks and task groups.
- `toy_cross_downstream`: Shows how to use `cross_downstream()` compared to `chain_linear()` (the former can only take 2 positional arguments).
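The sketch below is a minimal, hypothetical example of `chain_linear()`; the DAG ID and task names are placeholders. By contrast, `chain()` with two lists requires them to have the same length and creates pairwise dependencies.

```python
# Minimal sketch of chain_linear(), added in Airflow 2.7; task names are placeholders.
from pendulum import datetime

from airflow.decorators import dag, task
from airflow.models.baseoperator import chain_linear


@dag(start_date=datetime(2023, 8, 1), schedule=None, catchup=False, tags=["toy"])
def chain_linear_sketch():
    @task
    def t():
        pass

    # chain_linear accepts lists of different lengths: every task in the first
    # list becomes upstream of every task in the second list
    # (a >> c, a >> d, a >> e, b >> c, b >> d, b >> e).
    chain_linear(
        [t.override(task_id="a")(), t.override(task_id="b")()],
        [t.override(task_id="c")(), t.override(task_id="d")(), t.override(task_id="e")()],
    )


chain_linear_sketch()
```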
DAGs that showcase other features added in Airflow 2.7. A sketch of the `fail_stop` behavior follows the list below.
- `toy_apprise_provider_example`: Shows how to use the Apprise provider to send notifications. This DAG needs an Apprise connection `apprise_default` to be configured.
- `toy_deferrable_operators_config`: Shows how to use the `deferrable` parameter in the `TriggerDagRunOperator` to defer the execution of a DAG. The config option added in 2.7 is set to `True` in the Dockerfile with `ENV AIRFLOW__OPERATORS__DEFAULT_DEFERRABLE=True`.
- `toy_fail_stop`: DAG that has `fail_stop` enabled, with tasks that take different amounts of time to finish.
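As a hedged illustration of the `fail_stop` behavior, the sketch below is a minimal, hypothetical DAG (not the `toy_fail_stop` DAG itself): once one task fails, Airflow fails the DAG run and stops the other tasks that are still running.

```python
# Minimal sketch of the fail_stop DAG parameter added in Airflow 2.7;
# the tasks are placeholders, not taken from toy_fail_stop.
import time

from pendulum import datetime

from airflow.decorators import dag, task


@dag(
    start_date=datetime(2023, 8, 1),
    schedule=None,
    catchup=False,
    fail_stop=True,  # a single failed task fails the whole DAG run and stops running tasks
    tags=["toy"],
)
def fail_stop_sketch():
    @task
    def quick_fail():
        raise ValueError("this failure stops the rest of the run")

    @task
    def slow_task():
        time.sleep(120)  # with fail_stop, this task is stopped once quick_fail fails

    quick_fail()
    slow_task()


fail_stop_sketch()
```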
DAGs that are here to support another DAG. A sketch of a deferrable trigger for such a helper DAG follows below.
- `helper_dag_wait_30_seconds`: DAG that is triggered by the `TriggerDagRunOperator` in the `toy_deferrable_operators_config` DAG. It waits 30 seconds before completing.
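As a hedged sketch of how such a helper DAG might be triggered on the deferrable path, the example below uses `TriggerDagRunOperator` with `wait_for_completion=True` and `deferrable=True`; the triggering DAG ID, task ID, and timings are placeholders, not the repository's actual configuration.

```python
# Minimal sketch of triggering another DAG with the deferrable TriggerDagRunOperator;
# the triggering DAG ID and timings are placeholders.
from pendulum import datetime

from airflow import DAG
from airflow.operators.trigger_dagrun import TriggerDagRunOperator

with DAG(
    dag_id="deferrable_trigger_sketch",
    start_date=datetime(2023, 8, 1),
    schedule=None,
    catchup=False,
):
    trigger_helper = TriggerDagRunOperator(
        task_id="trigger_helper",
        trigger_dag_id="helper_dag_wait_30_seconds",
        wait_for_completion=True,  # wait until the triggered DAG run finishes
        deferrable=True,           # release the worker slot while waiting
        poke_interval=10,
    )
```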
- Setup/teardown guide.
- Managing task dependencies in Airflow guide.
- Manage Airflow DAG notifications guide.
- Deferrable operators guide.
This repository contains the following files and folders:
- `.astro`: files necessary for Astro CLI commands.
- `.devcontainer`: the GitHub Codespaces configuration.
- `dags`: all DAGs in your Airflow environment. Files in this folder will be parsed by the Airflow scheduler when looking for DAGs to add to your environment. You can add your own DAG files in this folder.
- `include`: supporting files that will be included in the Airflow environment.
  - `custom_xcom_backend`: folder.
    - `s3_xcom_backend.py`: contains a custom XCom backend using S3. See also custom XCom backends and the sketch at the end of this README.
- `plugins`: folder to place Airflow plugins. Empty.
- `tests`: folder to place pytest tests running on DAGs in the Airflow instance. Contains default tests.
- `.astro-registry.yaml`: file to configure DAGs being uploaded to the Astronomer registry. Can be ignored for local development.
- `.dockerignore`: list of files to ignore for Docker.
- `.env.example`: example environment variables for the DAGs in this repository. Copy this file to `.env` and replace the values with your own credentials.
- `.gitignore`: list of files to ignore for git.
- `Dockerfile`: the Dockerfile using the Astro CLI.
- `packages.txt`: system-level packages to be installed in the Airflow environment when the Docker image is built. Empty.
- `README.md`: this Readme.
- `requirements.txt`: Python packages to be installed and used by DAGs when the Docker image is built.
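For reference, the sketch below outlines the general shape of an S3-backed custom XCom backend. The class name, bucket name, and connection ID are placeholders; see `include/custom_xcom_backend/s3_xcom_backend.py` for the implementation actually used by this repository.

```python
# Hedged sketch of an S3-backed custom XCom backend; class, bucket, and connection
# names are placeholders. The real implementation lives in
# include/custom_xcom_backend/s3_xcom_backend.py.
import json
import uuid

from airflow.models.xcom import BaseXCom
from airflow.providers.amazon.aws.hooks.s3 import S3Hook


class S3XComBackendSketch(BaseXCom):
    PREFIX = "s3-xcom::"
    BUCKET_NAME = "my-xcom-bucket"  # assumption: replace with your own bucket
    AWS_CONN_ID = "aws_default"     # assumption: AWS connection defined via .env

    @staticmethod
    def serialize_value(value, *, key=None, task_id=None, dag_id=None, run_id=None, map_index=None, **kwargs):
        # upload the value to S3 and store only a reference in the metadata database
        s3_key = f"{dag_id}/{run_id}/{task_id}/{uuid.uuid4()}.json"
        S3Hook(aws_conn_id=S3XComBackendSketch.AWS_CONN_ID).load_string(
            string_data=json.dumps(value),
            key=s3_key,
            bucket_name=S3XComBackendSketch.BUCKET_NAME,
            replace=True,
        )
        return BaseXCom.serialize_value(S3XComBackendSketch.PREFIX + s3_key)

    @staticmethod
    def deserialize_value(result):
        # resolve the reference and download the value from S3 again
        value = BaseXCom.deserialize_value(result)
        if isinstance(value, str) and value.startswith(S3XComBackendSketch.PREFIX):
            s3_key = value.replace(S3XComBackendSketch.PREFIX, "", 1)
            data = S3Hook(aws_conn_id=S3XComBackendSketch.AWS_CONN_ID).read_key(
                key=s3_key, bucket_name=S3XComBackendSketch.BUCKET_NAME
            )
            return json.loads(data)
        return value
```

Such a backend is enabled by setting the `AIRFLOW__CORE__XCOM_BACKEND` environment variable to the backend's import path.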