S3FileTransformOperator: Permission denied on script


This is what is happening:

Given I have a file at dags/transform.py containing a script to transform a file
When I execute the S3FileTransformOperator with the file as a parameter
Then the task fails with [Errno 13] Permission denied: 'dags/transform.py'
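
For reference, a minimal sketch of the kind of task involved (the DAG id, bucket names, and keys are placeholders, and the import path may differ depending on the Airflow/provider version):

from datetime import datetime

from airflow import DAG
from airflow.providers.amazon.aws.operators.s3_file_transform import S3FileTransformOperator

# Illustrative sketch only; bucket names and keys are placeholders.
with DAG(dag_id="s3_transform_example", start_date=datetime(2022, 1, 1), schedule_interval=None) as dag:
    transform = S3FileTransformOperator(
        task_id="transform_file",
        source_s3_key="s3://my-source-bucket/input.csv",  # placeholder
        dest_s3_key="s3://my-dest-bucket/output.csv",     # placeholder
        transform_script="dags/transform.py",             # the script that fails with [Errno 13]
        replace=True,
    )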

When running on a local machine or on self-hosted Airflow, I can chmod +x the file, which resolves the issue.

What are my options on MWAA?

Asked 3 years ago, 908 views
4 Answers

Hi!

Unfortunately, with MWAA the worker containers are both ephemeral and limited to user-level access. The S3 operator should work with .sh files added to the /dags folder and referred to in the operator as /usr/local/airflow/dags/my_script.sh. The alternative would be to move the contents of your .py file into a PythonOperator and use the S3Hook to retrieve and store the file.
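
A rough sketch of that PythonOperator + S3Hook alternative (bucket names, keys, and the transform step are placeholders; imports assume Airflow 2 with the Amazon provider installed):

from airflow.operators.python import PythonOperator
from airflow.providers.amazon.aws.hooks.s3 import S3Hook

def transform_via_hook():
    # Download the source object, transform it in-process, then upload the result.
    hook = S3Hook()  # uses the default AWS connection / MWAA execution role
    data = hook.read_key(key="input.csv", bucket_name="my-source-bucket")  # placeholder
    transformed = data.upper()  # placeholder for your real transform logic
    hook.load_string(
        string_data=transformed,
        key="output.csv",               # placeholder
        bucket_name="my-dest-bucket",   # placeholder
        replace=True,
    )

# Define inside your DAG as usual:
transform_task = PythonOperator(
    task_id="transform_via_hook",
    python_callable=transform_via_hook,
)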

Thanks!

AWS
John_J
Answered 3 years ago

AWS support provided a work-around that involves changing the execute permissions of the file using an operator (i.e. a BashOperator running chmod).
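
A sketch of that work-around, assuming the script is deployed to /usr/local/airflow/dags/transform.py (task ids are illustrative):

from airflow.operators.bash import BashOperator

# Make the transform script executable before the task that needs it.
make_executable = BashOperator(
    task_id="make_transform_executable",
    bash_command="chmod +x /usr/local/airflow/dags/transform.py",
)

# Run it immediately upstream of the transform task, e.g.:
# make_executable >> transform_task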

Answered 3 years ago

This fix has stopped working as of today.
Not sure what AWS updated.

Answered 3 years ago

Hi,

If you place a file called "test.sh" in your dags folder and call it with an operator like this:

shell_test = BashOperator(
    task_id="shell_test",
    bash_command="sh '/usr/local/airflow/dags/test.sh'"
)

It should work fine. Running chmod will not be persistent, because as soon as the worker is scaled or replaced for any reason, your changes will be reverted.

Thanks!

AWS
John_J
Answered 3 years ago
