AWS Glue Python job PySpark libraries


How do I include PySpark libraries when running an AWS Glue Python job?

asked a year ago · 308 views
1 Answer

The question isn't entirely clear as worded.

Let me clarify based on what I understand. If you are asking how to include PySpark libraries in an AWS Glue Python Shell job, this cannot be done: the compute for this job type is a single resource and does not provide the driver and executors that a Spark engine needs.

For an AWS Glue PySpark job, you import the libraries as usual. When you create a job from the console, it provides the default imports needed to get started.
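For reference, the console-generated scaffold for a Glue PySpark job looks roughly like the sketch below. Note it is not runnable locally: the `awsglue` module is only available inside the Glue Spark runtime, and `getResolvedOptions` expects the job arguments Glue passes in.

```python
import sys

from awsglue.context import GlueContext
from awsglue.job import Job
from awsglue.transforms import *
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext

# Job parameters; --JOB_NAME is supplied automatically by Glue
args = getResolvedOptions(sys.argv, ["JOB_NAME"])

# The Spark driver and executors are provisioned by Glue; no manual setup
sc = SparkContext()
glueContext = GlueContext(sc)
spark = glueContext.spark_session  # a regular SparkSession for pyspark.sql work

job = Job(glueContext)
job.init(args["JOB_NAME"], args)

# ... your PySpark transformations go here ...

job.commit()
```

From `spark` you can use the usual PySpark APIs (`spark.read`, DataFrame operations, and so on) alongside Glue-specific constructs such as DynamicFrames.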

AWS
answered a year ago
AWS EXPERT reviewed a year ago
