S3FileTransformOperator: Permission denied on script

0

This is what is happening:

Given I have a file at dags/transform.py containing a script to transform a file
When I execute the S3FileTransformOperator with the file as a parameter
Then the task fails with [Errno 13] Permission denied: 'dags/transform.py'
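
For reference, the task is defined roughly like this (the bucket names and keys are placeholders, and the import path varies by provider version):

from airflow.providers.amazon.aws.operators.s3 import S3FileTransformOperator

transform_task = S3FileTransformOperator(
    task_id="transform_file",
    source_s3_key="s3://my-source-bucket/input.csv",
    dest_s3_key="s3://my-dest-bucket/output.csv",
    transform_script="/usr/local/airflow/dags/transform.py",
    replace=True,
)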

When running on a local machine or on self-hosted Airflow, I can chmod +x the file, which resolves the issue.

What are my options on MWAA?

Asked 3 years ago · 911 views
4 Answers
0

Hi!

Unfortunately, with MWAA the worker containers are both ephemeral and limited to user-level access. The S3 operator should work with .sh files added to the /dags folder and referenced in the operator as /usr/local/airflow/dags/my_script.sh. The alternative would be to run the contents of your .py file from a PythonOperator and use the S3Hook to retrieve and store the file.
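
A minimal sketch of that alternative, with hypothetical bucket names and keys and a stand-in transform:

from airflow.operators.python import PythonOperator
from airflow.providers.amazon.aws.hooks.s3 import S3Hook

def transform_s3_file():
    hook = S3Hook()
    # download the source object to the worker's local temp space
    local_path = hook.download_file(key="input.csv", bucket_name="my-source-bucket")
    with open(local_path) as f:
        transformed = f.read().upper()  # placeholder for your transform logic
    # upload the transformed content back to S3
    hook.load_string(transformed, key="output.csv", bucket_name="my-dest-bucket", replace=True)

transform = PythonOperator(task_id="transform", python_callable=transform_s3_file)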

Thanks!

AWS
John_J
Answered 3 years ago
0

AWS support provided a workaround that involves changing the execute permissions of the file with another operator (i.e. using a BashOperator to run chmod), as sketched below.
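
A sketch of that workaround, assuming transform_task is the S3FileTransformOperator from the question:

from airflow.operators.bash import BashOperator

make_executable = BashOperator(
    task_id="make_executable",
    bash_command="chmod +x /usr/local/airflow/dags/transform.py",
)
make_executable >> transform_task  # chmod runs on the worker before the transform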

Answered 3 years ago
0

This fix has stopped working as of today.
Not sure what AWS updated.

Answered 3 years ago
0

Hi,

If you place a file called "test.sh" in your dags folder and call it with an operator like this:

shell_test = BashOperator(
    task_id="shell_test",
    bash_command="sh '/usr/local/airflow/dags/test.sh'",
)

It should work fine. Running chmod will not be persistent, because as soon as the worker is scaled or replaced for any reason, your changes will be reverted.

Thanks!

AWS
John_J
Answered 3 years ago
