S3FileTransformOperator: Permission denied on script


This is what is happening:

Given I have a file at dags/transform.py containing a script to transform a file
When I execute the S3FileTransformOperator with the file as a parameter
Then the task fails with [Errno 13] Permission denied: 'dags/transform.py'
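
For reference, the task is defined roughly like this (bucket and key names are placeholders; the import path varies by Amazon provider version):

from airflow.providers.amazon.aws.operators.s3 import S3FileTransformOperator

# Hypothetical bucket/keys; transform_script points at the script in the dags folder
transform = S3FileTransformOperator(
    task_id="transform",
    source_s3_key="s3://my-bucket/input/data.csv",
    dest_s3_key="s3://my-bucket/output/data.csv",
    transform_script="dags/transform.py",
    replace=True,
)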

When running on a local machine or on self-hosted Airflow, I am able to chmod +x the file, which resolves the issue.

What are my options on MWAA?

4 Answers

Hi!

Unfortunately, with MWAA the worker containers are both ephemeral and limited to user-level access. The S3 operator should work with .sh files added to the /dags folder and referenced in the operator as /usr/local/airflow/dags/my_script.sh. The alternative is to move the contents of your .py file into a PythonOperator and use the S3Hook to retrieve and store the file.
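
For example, a minimal sketch of that PythonOperator approach (bucket, keys, and the transform logic here are placeholders):

from airflow.operators.python import PythonOperator
from airflow.providers.amazon.aws.hooks.s3 import S3Hook

def transform_file():
    hook = S3Hook()
    # download_file copies the S3 object to a local temp file and returns its path
    local_path = hook.download_file(key="input/data.csv", bucket_name="my-bucket")
    with open(local_path) as f:
        transformed = f.read().upper()  # stand-in for the real transform logic
    # store the result back to S3; no executable script needed on the worker
    hook.load_string(transformed, key="output/data.csv", bucket_name="my-bucket", replace=True)

transform_task = PythonOperator(
    task_id="transform_task",
    python_callable=transform_file,
)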

Thanks!

AWS
John_J
answered 3 years ago

AWS support provided a workaround that involves changing the execute permissions of the file using an operator (e.g. a BashOperator running chmod), as sketched below.
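
A minimal sketch of that workaround (task and path names are illustrative; see the answers below on why the change may not persist):

from airflow.operators.bash import BashOperator

# Hypothetical upstream task that restores the execute bit on the worker's
# copy of the script before the transform task runs
make_executable = BashOperator(
    task_id="make_executable",
    bash_command="chmod +x /usr/local/airflow/dags/transform.py",
)

# make_executable >> transform  # run ahead of the S3FileTransformOperator task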

answered 3 years ago

This fix has stopped working as of today.
Not sure what AWS updated.

answered 3 years ago

Hi,

If you place a file called "test.sh" in your dags folder and call it with an operator like this:

shell_test = BashOperator(
    task_id="shell_test",
    bash_command="sh '/usr/local/airflow/dags/test.sh'",
)

It should work fine. Running chmod will not be persistent, because as soon as the worker is scaled or replaced for any reason, your changes will be reverted.

Thanks!

AWS
John_J
answered 3 years ago
