Hi!
Unfortunately, with MWAA the worker containers are both ephemeral and limited to user-level access. The BashOperator should work with .sh files added to the dags folder and referenced in the operator as /usr/local/airflow/dags/my_script.sh. The alternative would be to run the file's contents from a PythonOperator, using the S3Hook to retrieve and store the file.
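The S3Hook alternative could be sketched roughly as below. The bucket, key, and function name are illustrative assumptions, and it presumes the Amazon provider package is available on the workers (it is preinstalled on MWAA):

```python
# Sketch of the S3Hook alternative: a callable (for a PythonOperator) that
# downloads the script from S3 to the worker and runs it there.
def fetch_and_run_script(bucket: str = "my-mwaa-bucket",
                         key: str = "scripts/my_script.sh") -> int:
    # Imports live inside the callable so they are only resolved at task
    # run time, on the worker.
    import subprocess
    from airflow.providers.amazon.aws.hooks.s3 import S3Hook

    hook = S3Hook()  # uses the worker's IAM role for credentials
    local_path = hook.download_file(key=key, bucket_name=bucket)

    # Invoke via "sh" so the downloaded file does not need the execute bit.
    return subprocess.run(["sh", local_path], check=True).returncode

# Wired into a DAG it would be used as, e.g.:
# run_script = PythonOperator(task_id="run_script",
#                             python_callable=fetch_and_run_script)
```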
Thanks!
AWS Support provided a workaround that involves changing the execute permissions of the file with an operator (e.g. a BashOperator running chmod).
This fix has stopped working as of today.
Not sure what AWS updated.
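One way to make that workaround robust on ephemeral workers is to fold the chmod into the same bash_command, so the permission is reset every time the task runs on whichever worker picks it up. The helper name below is an illustrative assumption; the path is the default MWAA dags location:

```python
# Build a bash_command that restores the execute bit immediately before
# running the script, so it works even on a freshly replaced worker.
DAG_DIR = "/usr/local/airflow/dags"  # default MWAA dags path

def chmod_and_run(script_name: str) -> str:
    path = f"{DAG_DIR}/{script_name}"
    return f"chmod +x '{path}' && '{path}'"

# Passed to the operator as, e.g.:
# shell_test = BashOperator(task_id="shell_test",
#                           bash_command=chmod_and_run("test.sh"))
print(chmod_and_run("test.sh"))
```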
Hi,
If you place a file called "test.sh" in your dags folder and call it with an operator like this:
shell_test = BashOperator(
    task_id="shell_test",
    bash_command="sh '/usr/local/airflow/dags/test.sh'"
)
It should work fine. Running chmod will not be persistent: as soon as the worker is scaled or replaced for any reason, your changes will be reverted.
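The reason the "sh test.sh" invocation sidesteps the permission problem entirely is that the interpreter only needs read access to the script, never the execute bit. A small local sketch of that behavior:

```python
# Show that "sh script.sh" runs a script that has no execute permission:
# the sh interpreter reads the file, so only the read bit is required.
import os
import stat
import subprocess
import tempfile

with tempfile.NamedTemporaryFile("w", suffix=".sh", delete=False) as f:
    f.write("echo hello from script\n")
    path = f.name

os.chmod(path, stat.S_IRUSR | stat.S_IWUSR)  # rw-------, no execute bit

result = subprocess.run(["sh", path], capture_output=True, text=True)
print(result.stdout.strip())  # hello from script

os.unlink(path)
```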
Thanks!