Unable to run a Python script using BashOperator


I need to run a Python script using the BashOperator. I placed the script in the dags folder of the S3 bucket and ran my DAG with a BashOperator task. It throws the error: python3: can't open file 'hello_world.py': No such file or directory

I also noticed that the Airflow worker runs the command from a temporary script location, e.g. /tmp/airflowtmpx03mq_qp/run_scriptlolkuw_q, and that this path changes on every DAG run. How can I run the script in this scenario? Please help me with the easiest approach possible on Managed Airflow. Thanks!
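For reference, a minimal sketch of what my DAG looks like (dag_id and task_id are placeholders; the import path assumes Airflow 2.x):

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator  # Airflow 2.x import path

with DAG(
    dag_id="run_hello_world",  # placeholder name
    start_date=datetime(2021, 5, 1),
    schedule_interval=None,
    catchup=False,
) as dag:
    run_script = BashOperator(
        task_id="run_script",
        # Fails with "No such file or directory": the worker executes this
        # command from a temporary directory, not from the dags folder.
        bash_command="python3 hello_world.py",
    )
```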

Edited by: ArjunAnaji on May 17, 2021 9:13 AM

asked 3 years ago · 2,218 views
1 answer
Accepted answer

Hi!

Anything placed in the S3 folder containing your DAGs is synced to /usr/local/airflow/dags on the MWAA worker. So, to run a file called "hello_world.py", you would use the bash command "python3 /usr/local/airflow/dags/hello_world.py".
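For example, a minimal sketch of the task (task_id is illustrative; the import path assumes Airflow 2.x):

```python
from airflow.operators.bash import BashOperator  # Airflow 2.x import path

run_script = BashOperator(
    task_id="run_script",
    # Reference the script by its absolute path on the worker,
    # where MWAA syncs the contents of the S3 dags folder.
    bash_command="python3 /usr/local/airflow/dags/hello_world.py",
)
```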

Thanks!

AWS
John_J
answered 3 years ago
  • I'm still getting the same error even after using the bash command "python3 /usr/local/airflow/dags/hello_world.py"

    Update: I figured it out. You just need to make sure you include the subfolder names in the path. Alternatively, you can use the bash command "python3 ${AIRFLOW_HOME}/dags/folder1/folder2/../hello_world.py". Here, ${AIRFLOW_HOME} is /usr/local/airflow. See the sketch below.
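For example (folder1 and folder2 stand in for the actual subfolder names under dags/):

```python
from airflow.operators.bash import BashOperator  # Airflow 2.x import path

run_script = BashOperator(
    task_id="run_script",
    # Bash expands ${AIRFLOW_HOME} to /usr/local/airflow on the MWAA worker;
    # include any subfolders between dags/ and the script in the path.
    bash_command="python3 ${AIRFLOW_HOME}/dags/folder1/folder2/hello_world.py",
)
```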
