S3FileTransformOperator: Permission denied on script

0

This is what is happening:

Given I have a file at dags/transform.py containing a script to transform a file
When I execute the S3FileTransformOperator with the file as a parameter
Then the task fails with [Errno 13] Permission denied: 'dags/transform.py'
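
For context, the task is set up roughly like this (a sketch; the bucket names and S3 keys are placeholders):

from airflow.providers.amazon.aws.operators.s3 import S3FileTransformOperator

# The operator runs the transform script as a subprocess, so the file needs
# the execute bit set, which is why it fails here with Errno 13.
transform = S3FileTransformOperator(
    task_id="transform_file",
    source_s3_key="s3://my-source-bucket/input/data.csv",
    dest_s3_key="s3://my-dest-bucket/output/data.csv",
    transform_script="dags/transform.py",
    replace=True,
)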

When running on a local machine or on self-hosted Airflow, I am able to chmod +x the file, which resolves the issue.

What are my options on MWAA?

posted 3 years ago · 917 views
4 Answers
0

Hi!

Unfortunately, with MWAA the worker containers are both ephemeral and limited to user-level access. The S3 operator should work with .sh files added to the /dags folder and referenced in the operator as /usr/local/airflow/dags/my_script.sh. The alternative would be to move the contents of your .py file into a PythonOperator and use the S3Hook to retrieve and store the file.
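
For the PythonOperator route, a minimal sketch (bucket names, keys, and the transform step itself are placeholders):

from airflow.operators.python import PythonOperator
from airflow.providers.amazon.aws.hooks.s3 import S3Hook

def transform_s3_file():
    hook = S3Hook()  # uses the default AWS connection
    # Retrieve the source object from S3 as a string
    data = hook.read_key(key="input/data.csv", bucket_name="my-source-bucket")
    transformed = data.upper()  # stand-in for your real transform logic
    # Store the transformed result back to S3
    hook.load_string(
        string_data=transformed,
        key="output/data.csv",
        bucket_name="my-dest-bucket",
        replace=True,
    )

transform_task = PythonOperator(
    task_id="transform_s3_file",
    python_callable=transform_s3_file,
)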

Thanks!

AWS
John_J
answered 3 years ago
0

AWS support provided a workaround that involves changing the execute permissions of the file using an operator (i.e., a BashOperator running chmod).
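
A rough sketch of that workaround (task id and script path are illustrative):

from airflow.operators.bash import BashOperator

# Set the execute bit on the worker before the transform task runs.
make_executable = BashOperator(
    task_id="make_transform_executable",
    bash_command="chmod +x /usr/local/airflow/dags/transform.py",
)

# Chain it upstream of the S3FileTransformOperator task, e.g.:
# make_executable >> transform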

answered 3 years ago
0

This fix has stopped working as of today.
Not sure what AWS updated.

answered 3 years ago
0

Hi,

If you place a file called "test.sh" in your dags folder and call it with an operator like this:

from airflow.operators.bash import BashOperator

# Invoking the script through `sh` avoids needing the execute bit on the file.
shell_test = BashOperator(
    task_id="shell_test",
    bash_command="sh '/usr/local/airflow/dags/test.sh'",
)

It should work fine. Running chmod will not persist, because as soon as the worker is scaled or replaced for any reason, your changes will be reverted.

Thanks!

AWS
John_J
answered 3 years ago
