Unable to run a Python script using BashOperator


I need to run a Python script using BashOperator. I placed the script in the dags folder of the S3 bucket and ran my DAG with a BashOperator. It throws the error: python3: can't open file 'hello_world.py': No such file or directory

Also, I noticed that the Airflow worker runs the command from a temporary script location, say /tmp/airflowtmpx03mq_qp/run_scriptlolkuw_q, and this path changes on every DAG run. How can I run the script in this scenario? Please help me with the easiest approach possible on Managed Airflow. Thanks

Edited by: ArjunAnaji on May 17, 2021 9:13 AM

asked 3 years ago · 2,201 views

1 Answer

Accepted Answer

Hi!

Anything placed in the S3 folder containing your DAGs is synced to /usr/local/airflow/dags on the MWAA worker. So, to run a file called "hello_world.py", you would use the bash command "python3 /usr/local/airflow/dags/hello_world.py".
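To make the path logic concrete, here is a minimal sketch in plain Python of how that bash command can be built from the worker's synced dags directory. The helper name script_command is an illustrative assumption, not part of Airflow; the BashOperator usage is shown only as a comment since it assumes apache-airflow is installed:

```python
import os

# MWAA syncs the S3 dags folder to /usr/local/airflow/dags on the worker.
# Fall back to that path when AIRFLOW_HOME is not set in the environment.
AIRFLOW_HOME = os.environ.get("AIRFLOW_HOME", "/usr/local/airflow")

def script_command(relative_path):
    """Return the bash command that runs a synced script with python3."""
    return "python3 " + os.path.join(AIRFLOW_HOME, "dags", relative_path)

# With apache-airflow installed, this would plug into a DAG roughly as:
#   BashOperator(task_id="run_script",
#                bash_command=script_command("hello_world.py"))
print(script_command("hello_world.py"))
```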

Thanks!

AWS
John_J
answered 3 years ago
  • I'm still getting the same error even after using the bash command "python3 /usr/local/airflow/dags/hello_world.py"

    Update: I figured it out; you just need to make sure you include the subfolder names. Alternatively, you can use the bash command "python3 ${AIRFLOW_HOME}/dags/folder1/folder2/../hello_world.py". Here, ${AIRFLOW_HOME} is /usr/local/airflow/
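A quick sketch of how that expands, assuming the AIRFLOW_HOME environment variable (defaulting to /usr/local/airflow) and treating "folder1/folder2" as placeholder subfolder names rather than a real layout:

```python
import os

# The S3 sync preserves the relative layout under dags/, so scripts nested
# in subfolders need those subfolder names included in the path.
# "folder1/folder2" is an illustrative placeholder, not a real layout.
airflow_home = os.environ.get("AIRFLOW_HOME", "/usr/local/airflow")
cmd = "python3 {}/dags/folder1/folder2/hello_world.py".format(airflow_home)
print(cmd)
```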
