The Composer Local Development CLI tool streamlines Apache Airflow DAG development for Cloud Composer 2 by running an Airflow environment locally. This local Airflow environment uses an image of a specific Cloud Composer version.
To run the CLI tool, install the following prerequisites:
- Python 3.7-3.10 with `pip`
- gcloud CLI
- Docker (must be installed and running on the local system)
If not already done, get new user credentials to use for Application Default Credentials:

```shell
gcloud auth application-default login
```
Log in to gcloud using your Google account:

```shell
gcloud auth login
```
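Optionally, you can set a default project in gcloud so that you don't have to pass `--project` to every command. `example-project` here is a placeholder; substitute your own project ID:

```shell
# Set the default project used by gcloud and tools that rely on it
gcloud config set project example-project
```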
- Clone this repository.
- Create a virtual environment and activate it (recommended):

  ```shell
  python -m venv env
  source env/bin/activate
  ```

- In the top-level directory of the cloned repository, run:

  ```shell
  pip install .
  ```
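To confirm that the installation succeeded, you can print the tool's help text; the exact list of subcommands shown depends on the version you installed:

```shell
composer-dev --help
```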
Only the following information is taken from a Cloud Composer environment:
- Image version (versions of Cloud Composer and Airflow used in your environment).
- List of custom PyPI packages installed in your environment.
- Commented-out list of names of environment variables set in your environment.

Important: Cloud Composer does not copy the values of environment variables. You can manually uncomment environment variables in the configuration file and set their values as required.
Other information and configuration parameters from the environment, such as DAG files, DAG run history, Airflow variables, and connections, are not copied from your Composer environment.
To create a local Airflow environment from an existing Cloud Composer environment:
```shell
composer-dev create example-local-environment \
    --from-source-environment example-environment \
    --location us-central1 \
    --project example-project \
    --port 8081 \
    --dags-path example_directory/dags
```
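If you don't have an existing Cloud Composer environment to copy settings from, you can also create a local environment directly from a Cloud Composer image version. A sketch, assuming your version of the tool supports the `list-available-versions` command and the `--from-image-version` flag (the version string below is only a placeholder; use one from the listed output):

```shell
# List recent Cloud Composer image versions available to the tool
composer-dev list-available-versions --include-past-releases --limit 10

# Create a local environment from a specific image version
# (composer-2.6.6-airflow-2.6.3 is a placeholder version string)
composer-dev create example-local-environment \
    --from-image-version composer-2.6.6-airflow-2.6.3 \
    --dags-path example_directory/dags
```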
To start a local Airflow environment, run:
```shell
composer-dev start LOCAL_ENVIRONMENT_NAME
```
When you restart a local Airflow environment, the Composer Local Development CLI tool restarts the Docker container where the environment runs. All Airflow components are stopped and started again. As a result, all DAG runs that are executing during the restart are marked as failed.
To restart or start a stopped local Airflow environment, run:
```shell
composer-dev restart LOCAL_ENVIRONMENT_NAME
```
To stop a local Airflow environment, run:
```shell
composer-dev stop LOCAL_ENVIRONMENT_NAME
```
DAGs are stored in the directory that you specified in the `--dags-path` parameter when you created your local Airflow environment. By default, this directory is `./composer/<local_environment_name>/dags`. You can get the directory used by your environment with the `describe` command.
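For example, assuming a local environment named `example-local-environment`, the `describe` command prints the environment's state, including the DAGs directory it uses:

```shell
composer-dev describe example-local-environment
```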
To add and update DAGs, change files in this directory. You do not need to restart your local Airflow environment.
To configure environment variables, edit the `variables.env` file in the environment directory: `./composer/<local_environment_name>/variables.env`.

The `variables.env` file must contain key-value definitions, one line for each environment variable:

```
EXAMPLE_VARIABLE=True
ANOTHER_VARIABLE=test
AIRFLOW__WEBSERVER__DAG_DEFAULT_VIEW=graph
```
To install or remove PyPI packages, modify the `requirements.txt` file in the environment directory: `./composer/<local_environment_name>/requirements.txt`.
To apply these changes, restart your local Airflow environment.
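As an illustration, a `requirements.txt` that installs two extra PyPI packages might look like the following; the package names and version pins are only examples, not packages this tool requires:

```
scikit-learn==1.3.2
requests>=2.31
```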