S3FileTransformOperator: Permission denied on script


This is what is happening:

Given I have a file at dags/transform.py containing a script to transform a file
When I execute the S3FileTransformOperator with that file as the transform script
Then the task fails with [Errno 13] Permission denied: 'dags/transform.py'
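For reference, the task is defined roughly like this (bucket and key names below are placeholders, not my real values):

from airflow.providers.amazon.aws.operators.s3 import S3FileTransformOperator  # import path varies by provider version

transform = S3FileTransformOperator(
    task_id="transform_file",
    source_s3_key="s3://source-bucket/input.csv",  # placeholder
    dest_s3_key="s3://dest-bucket/output.csv",     # placeholder
    transform_script="dags/transform.py",
    replace=True,
)

The operator runs the transform script as a subprocess, which is why the file needs the execute bit set.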

When running on a local machine, or on self-hosted Airflow, I can chmod +x the file, which resolves the issue.

What are my options on MWAA?

asked 2 years ago · 306 views
4 Answers

Hi!

Unfortunately, with MWAA the worker containers are both ephemeral and limited to user-level access. The operator should work with .sh files added to the /dags folder and referenced in the operator as /usr/local/airflow/dags/my_script.sh. The alternative is to run the logic from your .py file inside a PythonOperator and use the S3Hook to retrieve and store the file, along the lines of the sketch below.
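A rough sketch of that alternative (bucket and key names are placeholders, and the transform is a stand-in for your own logic):

from airflow.operators.python import PythonOperator
from airflow.providers.amazon.aws.hooks.s3 import S3Hook

def transform_s3_file():
    hook = S3Hook()
    # Download the source object to the worker's local disk
    local_path = hook.download_file(key="input.csv", bucket_name="source-bucket")
    with open(local_path) as f:
        transformed = f.read().upper()  # stand-in for your real transform
    # Upload the result; no executable script is involved at any point
    hook.load_string(transformed, key="output.csv", bucket_name="dest-bucket", replace=True)

transform = PythonOperator(
    task_id="transform",
    python_callable=transform_s3_file,
)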

Thanks!

John_J
answered 2 years ago

AWS support provided a workaround that involves changing the execute permissions of the file with an upstream task (i.e. a BashOperator running chmod +x), as sketched below.
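Roughly, the chained tasks look like this (the script path is a placeholder, and transform_task stands in for the S3FileTransformOperator task):

from airflow.operators.bash import BashOperator

make_executable = BashOperator(
    task_id="make_executable",
    bash_command="chmod +x /usr/local/airflow/dags/transform.py",
)

# Run the chmod task before the task that executes the script
make_executable >> transform_task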

answered 2 years ago

This fix has stopped working as of today.
Not sure what AWS updated.

answered 2 years ago

Hi,

If you place a file called "test.sh" in your dags folder and call it with an operator like this:

from airflow.operators.bash import BashOperator

shell_test = BashOperator(
    task_id="shell_test",
    bash_command="sh '/usr/local/airflow/dags/test.sh'",
)

It should work fine, because passing the script to sh does not require the execute bit. Running chmod will not be persistent: as soon as the worker is scaled or replaced for any reason, your changes will be reverted.

Thanks!

John_J
answered 2 years ago
