Running PySpark Jobs Locally with the AWS Glue ETL Library on Windows


I have followed the steps outlined in "Developing using the AWS Glue ETL library" for Python on Windows, found here:

After following the installation instructions, it's unclear how to actually execute a Spark job locally from PowerShell.

I have a simple PySpark job:

from pyspark.context import SparkContext
from pyspark.sql import SparkSession

sc = SparkContext()
spark = SparkSession(sc)

I have:

  1. navigated to the aws-glue-libs directory
  2. in PowerShell, attempted .\bin\gluesparksubmit F:\programming\
  3. the command produced no output
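For reference, the wrapper scripts in aws-glue-libs are written for a bash shell, so the invocation they expect looks something like the following (the workspace path and script name here are illustrative assumptions, not the actual files from this question):

```shell
# From a bash-compatible shell (e.g. WSL or Git Bash), inside the
# aws-glue-libs checkout. The job path below is a hypothetical example.
cd ~/aws-glue-libs
./bin/gluesparksubmit /f/programming/my_job.py
```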

Can you please provide a correct example of how to execute an AWS Glue job locally?

The ultimate goal here is to develop my Glue jobs locally in PyCharm before deploying them to the AWS Glue service.

Asked a year ago · 485 views
1 answer

Those scripts are written for Linux bash, not PowerShell.
It might be possible to get them working under a Cygwin shell, but it might be easier to use the Docker option.
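The Docker route sidesteps the bash wrappers entirely: the job runs inside AWS's published Glue container. A minimal sketch, assuming the public amazon/aws-glue-libs image; the image tag, mounted paths, profile name, and job filename are all assumptions to adapt to your setup:

```shell
# Mount the local workspace and AWS credentials into the Glue container,
# then run the job with spark-submit. On Windows, Docker Desktop accepts
# a drive-letter path like F:\programming as the host side of the mount.
docker run -it --rm \
  -v "%USERPROFILE%\.aws:/home/glue_user/.aws" \
  -v "F:\programming:/home/glue_user/workspace" \
  -e AWS_PROFILE=default \
  amazon/aws-glue-libs:glue_libs_3.0.0_image_01 \
  spark-submit /home/glue_user/workspace/my_job.py
```

Once the container works, PyCharm can also be pointed at it as a remote interpreter, which fits the stated goal of developing locally before deploying.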

AWS
Answered a year ago
