Hi!
Unfortunately, with MWAA the worker containers are both ephemeral and limited to user-level access. The Bash operator should work with .sh files added to the /dags folder and referenced in the operator as /usr/local/airflow/dags/my_script.sh. The alternative would be to call your logic from a Python operator and use the S3Hook to retrieve and store the file.
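The S3Hook alternative could be sketched roughly as below. This is a hypothetical illustration, not MWAA-verified code: `fetch_and_run` and its `download` parameter are made-up names, and `download` stands in for a real retrieval call (e.g. `S3Hook(...).read_key(...)` inside a PythonOperator callable), so the core steps can be shown without AWS access.

```python
import os
import stat
import subprocess
import tempfile

def fetch_and_run(download, script_name="my_script.sh"):
    """Retrieve a script, mark it executable, and run it.

    `download` is a hypothetical stand-in for an S3Hook call inside a
    PythonOperator callable; it must return the script body as bytes.
    """
    workdir = tempfile.mkdtemp()
    path = os.path.join(workdir, script_name)
    with open(path, "wb") as f:
        f.write(download())
    # chmod is applied at task run time, so the ephemeral workers do not
    # matter: every run sets the execute bit again before running.
    os.chmod(path, os.stat(path).st_mode | stat.S_IXUSR)
    return subprocess.run([path], capture_output=True, text=True).stdout

# Example with a stubbed download (no S3 access needed):
print(fetch_and_run(lambda: b"#!/bin/sh\necho hello\n"))
```

Because the download step is injected, the same function works whether the file comes from S3, the dags folder, or anywhere else.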
Thanks!
AWS support provided a work-around that involves changing the execute permissions of the file with an operator (e.g. a Bash operator running chmod).
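Outside Airflow, the essential steps of that work-around can be demonstrated in plain shell; the path below is hypothetical, and in MWAA the same two commands would go into the Bash operator's bash_command:

```shell
# Create a throwaway script, set the execute bit at run time, then run it.
printf '#!/bin/sh\necho ok\n' > /tmp/demo.sh
chmod +x /tmp/demo.sh
/tmp/demo.sh
```

Since chmod runs as part of the task itself, it is reapplied on every run and the ephemerality of the worker is irrelevant.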
Hi,
If you place a file called "test.sh" in your dags folder and call it with an operator like this:
from airflow.operators.bash import BashOperator

shell_test = BashOperator(
    task_id="shell_test",
    bash_command="sh '/usr/local/airflow/dags/test.sh'",
)
It should work fine. Running chmod will not be persistent: as soon as the worker is scaled or replaced for any reason, your changes will be reverted.
Thanks!