Unable to run a Python script using BashOperator


I need to run a Python script using BashOperator. I placed the script in the dags folder of the S3 bucket and ran my DAG with BashOperator. It throws the error: python3: can't open file 'hello_world.py': No such file or directory

Also, I noticed that the Airflow worker runs the command from a temporary script location, say /tmp/airflowtmpx03mq_qp/run_scriptlolkuw_q, and this path changes on every DAG run. How can I run the script in this scenario? Please help me with the easiest approach possible on Managed Airflow. Thanks

Edited by: ArjunAnaji on May 17, 2021 9:13 AM

Asked 3 years ago · 2217 views
1 answer

Accepted Answer

Hi!

Anything placed in the S3 folder containing your DAGs will be synced to /usr/local/airflow/dags on the MWAA worker. So, to run a file called "hello_world.py", you would use the bash command "python3 /usr/local/airflow/dags/hello_world.py".
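For reference, a minimal DAG along these lines might look like the sketch below. It assumes hello_world.py sits at the top level of the dags folder, and the dag_id, start_date, and task_id are illustrative; the import path shown is for Airflow 2.x (on Airflow 1.10 environments the operator lives under airflow.operators.bash_operator). It requires the Airflow package to run.

```python
# Sketch: run a script synced from the S3 dags folder via BashOperator.
# Assumes hello_world.py is at the top level of the dags folder on the worker.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator  # Airflow 2.x import path

with DAG(
    dag_id="run_hello_world",        # illustrative name
    start_date=datetime(2021, 1, 1),
    schedule_interval=None,          # trigger manually
    catchup=False,
) as dag:
    run_script = BashOperator(
        task_id="run_script",
        # Use the absolute path on the MWAA worker, not a relative one:
        bash_command="python3 /usr/local/airflow/dags/hello_world.py",
    )
```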

Thanks!

AWS
John_J
Answered 3 years ago
  • I'm still getting the same error even after using the bash command "python3 /usr/local/airflow/dags/hello_world.py"

    Update: I figured it out; you just need to make sure you include the subfolder names. Alternatively, you can use the bash command "python3 ${AIRFLOW_HOME}/dags/folder1/folder2/../hello_world.py". Here, ${AIRFLOW_HOME} is /usr/local/airflow/
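As a sketch of the commenter's approach, the full path can be built from ${AIRFLOW_HOME}. The subfolder names below (my_team/etl) are hypothetical examples, not the asker's actual layout:

```shell
# /usr/local/airflow is the AIRFLOW_HOME on MWAA workers, per the answer above.
AIRFLOW_HOME=/usr/local/airflow

# Hypothetical subfolders under dags/; substitute your own layout.
SCRIPT="${AIRFLOW_HOME}/dags/my_team/etl/hello_world.py"

# The command the BashOperator would run:
echo "python3 ${SCRIPT}"
```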
