I'm looking for a good way to automate migrating DAGs between multiple instances (staging/production) as part of a DevOps workflow. I'd like to run my DAGs in the staging environment with one set of configuration parameters (S3 bucket paths, etc.) and run the same DAGs in the production environment with another, without requiring any change to the DAG code.
Here is what I'm considering:
- Set an environment variable on each Airflow/MWAA instance as part of initial setup (e.g. env=staging or env=prod)
- Create a JSON configuration file containing both the staging and production configuration parameters, and store it alongside the DAGs
- Create a prerequisite DAG that reads the environment variable and sets Airflow Variables to the matching staging/prod configuration parameters; any DAG that requires configuration depends on it
- Reference those values via templated variables in the DAGs that require configuration
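To make the idea concrete, here's a minimal sketch of the config-lookup step. The file path, JSON keys, and the `AIRFLOW_ENV` variable name are illustrative assumptions, not anything Airflow defines:

```python
# Minimal sketch: pick the parameter set matching this instance's environment.
# AIRFLOW_ENV would be set once per instance (e.g. env=staging or env=prod);
# the path and JSON shape below are assumptions for illustration.
import json
import os


def load_env_config(config_path: str = "dags/config/env_config.json") -> dict:
    """Return the config block for the current environment.

    Expects a JSON file shaped like:
      {"staging": {"s3_bucket": "..."}, "prod": {"s3_bucket": "..."}}
    """
    env = os.environ.get("AIRFLOW_ENV", "staging")  # default to staging
    with open(config_path) as f:
        return json.load(f)[env]
```

The prerequisite DAG could push the resulting dict into an Airflow Variable (e.g. `Variable.set("env_config", json.dumps(cfg))`), so downstream DAGs can reference values through Jinja templates like `{{ var.json.env_config.s3_bucket }}` without any per-environment code changes.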
Is there a better way to approach this? Any advice is appreciated!
Thanks, your answer was helpful. Do the plugins, and consequently the environment variables, get reloaded only when the environment is built or rebuilt?