Unable to run a Python script using BashOperator


I need to run a Python script using BashOperator. I placed the script in the dags folder of my S3 bucket and ran my DAG with a BashOperator task. It throws the error: python3: can't open file 'hello_world.py': No such file or directory

Also, I noticed that the Airflow worker runs the command from a temporary script location, say /tmp/airflowtmpx03mq_qp/run_scriptlolkuw_q, and this path changes on every DAG run. How can I run the script in this scenario? Please suggest the easiest approach possible on Managed Airflow. Thanks
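
My DAG looks roughly like this (simplified; the dag_id is a placeholder and the import path is for Airflow 2.x):

    from datetime import datetime

    from airflow import DAG
    from airflow.operators.bash import BashOperator

    with DAG(
        dag_id="hello_world_dag",
        start_date=datetime(2021, 5, 1),
        schedule_interval=None,
        catchup=False,
    ) as dag:
        # This fails with "No such file or directory" because the command
        # runs from a temporary working directory, not the dags folder.
        run_script = BashOperator(
            task_id="run_script",
            bash_command="python3 hello_world.py",
        )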

Edited by: ArjunAnaji on May 17, 2021 9:13 AM

Asked 3 years ago · Viewed 2201 times

1 Answer

Accepted Answer

Hi!

Anything placed in the S3 folder containing your DAGs is synced to /usr/local/airflow/dags on the MWAA worker. Note that BashOperator executes your command from a temporary working directory (the /tmp/airflowtmp... path you noticed, which changes on every run), so a relative path like hello_world.py won't resolve. To run a file called "hello_world.py", use an absolute path in the bash command: "python3 /usr/local/airflow/dags/hello_world.py"
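
For example, with the same DAG setup as in your question, the task would look like:

    run_script = BashOperator(
        task_id="run_script",
        # Absolute path to the copy of the script that MWAA syncs from S3.
        bash_command="python3 /usr/local/airflow/dags/hello_world.py",
    )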

Thanks!

AWS
John_J
Answered 3 years ago
  • I'm still getting the same error even after using the bash command "python3 /usr/local/airflow/dags/hello_world.py".

    Update: I figured it out; you just need to make sure you include the subfolder names in the path. Alternatively, you can use the bash command "python3 ${AIRFLOW_HOME}/dags/folder1/folder2/../hello_world.py". Here, ${AIRFLOW_HOME} is /usr/local/airflow.
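
    For example (folder1/folder2 here stands in for your actual subfolder path under dags/):

        run_script = BashOperator(
            task_id="run_script",
            # ${AIRFLOW_HOME} is expanded by the shell on the MWAA worker,
            # where it is set to /usr/local/airflow.
            bash_command="python3 ${AIRFLOW_HOME}/dags/folder1/folder2/hello_world.py",
        )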
