Unable to run a Python script using BashOperator


I need to run a Python script using BashOperator. I placed the script in the dags folder of my S3 bucket and ran my DAG with a BashOperator. It throws the error: python3: can't open file 'hello_world.py': No such file or directory

I also noticed that the Airflow worker runs the command from a temporary script location, say /tmp/airflowtmpx03mq_qp/run_scriptlolkuw_q, and that this path changes on every DAG run. How can I run the script in this scenario? Please help me with the easiest approach possible on Managed Airflow. Thanks
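For reference, a minimal sketch of the DAG I'm running (assuming Airflow 2.x imports; on 1.10 the import is airflow.operators.bash_operator, and the dag_id, start_date, and schedule are illustrative):

from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="run_script_dag",
    start_date=datetime(2021, 5, 1),
    schedule_interval=None,
    catchup=False,
) as dag:
    # Fails: the worker writes the bash command to a temp file
    # (e.g. /tmp/airflowtmp.../run_script...) and executes it from a
    # temporary working directory that does not contain hello_world.py.
    run_script = BashOperator(
        task_id="run_script",
        bash_command="python3 hello_world.py",
    )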

Edited by: ArjunAnaji on May 17, 2021 9:13 AM

asked 3 years ago · 2,201 views
1 Answer
Accepted Answer

Hi!

Anything placed in the S3 folder containing your DAGs will be synced to /usr/local/airflow/dags on the MWAA worker. So, to run a file called "hello_world.py", you would use the bash command "python3 /usr/local/airflow/dags/hello_world.py".
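For example, a minimal sketch of the fixed task (assuming Airflow 2.x; the dag_id, start_date, and schedule are placeholders):

from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="hello_world",
    start_date=datetime(2021, 5, 1),
    schedule_interval=None,
    catchup=False,
) as dag:
    # Reference the script by its absolute path: MWAA syncs the S3
    # dags folder to /usr/local/airflow/dags on the worker, so the
    # script is always at the same place regardless of the temporary
    # directory the bash command itself runs from.
    run_hello_world = BashOperator(
        task_id="run_hello_world",
        bash_command="python3 /usr/local/airflow/dags/hello_world.py",
    )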

Thanks!

AWS
John_J
answered 3 years ago
  • I'm still getting the same error even after using the bash command "python3 /usr/local/airflow/dags/hello_world.py"

    Update: I figured it out; you just need to make sure you include the subfolder names in the path. Alternatively, you can use the bash command "python3 ${AIRFLOW_HOME}/dags/folder1/folder2/../hello_world.py". Here, ${AIRFLOW_HOME} is /usr/local/airflow/
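
    As a sketch, the working task looked roughly like this (Airflow 2.x import; "folder1/folder2" stands in for the real subfolders under dags/ in the S3 bucket):

    from datetime import datetime

    from airflow import DAG
    from airflow.operators.bash import BashOperator

    with DAG(
        dag_id="hello_world_subfolder",
        start_date=datetime(2021, 5, 1),
        schedule_interval=None,
        catchup=False,
    ) as dag:
        # Bash expands ${AIRFLOW_HOME} to /usr/local/airflow at runtime,
        # since BashOperator inherits the worker's environment when no
        # env argument is given.
        run_script = BashOperator(
            task_id="run_script",
            bash_command="python3 ${AIRFLOW_HOME}/dags/folder1/folder2/hello_world.py",
        )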
