Glue ETL PySpark job fails after upgrade from Glue version 2.0 to 3.0: "An error occurred while calling pyWriteDynamicFrame ... EOFException occurred while reading the port number from pyspark.daemon's stdout"


A PySpark Glue ETL job fails after upgrading from Glue Version 2.0 to 3.0.

The job fails while writing a DynamicFrame to S3 in Parquet format.

Error snippet from the log: py4j.protocol.Py4JJavaError: An error occurred while calling o433.pyWriteDynamicFrame. : org.apache.spark.SparkException: Job aborted

Caused by: org.apache.spark.SparkException: Job aborted due to stage failure: Task 1 in stage 30.2 failed 4 times, most recent failure: Lost task 1.3 in stage 30.2 (TID 2007594) (10.78.9.145 executor 8): org.apache.spark.SparkException: EOFException occurred while reading the port number from pyspark.daemon's stdout
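
For reference, the write uses the standard DynamicFrame S3 sink. A rough sketch of the call (the frame name and bucket path here are placeholders, not the actual job code):

# Placeholder sketch of the failing call; output_dyf and the S3 path
# stand in for the real DynamicFrame and output location.
glueContext.write_dynamic_frame.from_options(
    frame=output_dyf,
    connection_type="s3",
    connection_options={"path": "s3://example-bucket/output/"},
    format="parquet",
)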

mrjimi
asked 2 years ago · viewed 2,844 times
2 answers

I changed the worker_type from G.1X to G.2X and the job completed successfully, albeit in 38 hours; the larger workers give each executor more memory, which is typically what this pyspark.daemon EOFException points to. I then tuned the Spark code so that all three DataFrames are partitioned on the same join key with .repartition("attribute_name"), and also doubled the number of workers from 5 to 10. The job then completed in 1 hour 20 minutes. The repartitioning helped the JOIN that produces the final dataset written to S3 (see the sketch below).
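
A minimal sketch of that tuning, assuming hypothetical input DataFrames (sales_df, orders_df, customers_df), the join key "attribute_name", and an example output path:

# Sketch only: the DataFrames, join key, and S3 path below are stand-ins
# for the job's actual inputs and output.
from awsglue.context import GlueContext
from awsglue.dynamicframe import DynamicFrame
from pyspark.context import SparkContext

sc = SparkContext.getOrCreate()
glue_context = GlueContext(sc)
spark = glue_context.spark_session

# Dummy inputs so the sketch is self-contained; the real job would read
# these from the Glue Data Catalog or S3.
sales_df = spark.createDataFrame([(1, "s")], ["attribute_name", "sales_val"])
orders_df = spark.createDataFrame([(1, "o")], ["attribute_name", "order_val"])
customers_df = spark.createDataFrame([(1, "c")], ["attribute_name", "cust_val"])

# Partition all three DataFrames on the shared join key so matching rows
# land together and the JOIN shuffles far less data across executors.
sales_df = sales_df.repartition("attribute_name")
orders_df = orders_df.repartition("attribute_name")
customers_df = customers_df.repartition("attribute_name")

joined_df = (
    sales_df
    .join(orders_df, "attribute_name")
    .join(customers_df, "attribute_name")
)

# Convert back to a DynamicFrame and write to S3 as Parquet.
joined_dyf = DynamicFrame.fromDF(joined_df, glue_context, "joined_dyf")
glue_context.write_dynamic_frame.from_options(
    frame=joined_dyf,
    connection_type="s3",
    connection_options={"path": "s3://example-bucket/output/"},
    format="parquet",
)

Co-partitioning all three inputs on the same key lets the join colocate matching rows, avoiding the skewed, oversized tasks that were exhausting executor memory.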

mrjimi
answered 2 years ago
  • Thank you for the feedback. How long was it taking with Glue 2.0? Were you using the same number of nodes?


Hi,

Have you looked at the documentation about migrating from Glue version 2.0 to 3.0, and from Spark 2 to Spark 3?

Do you use external libraries?

If you cannot find any indication in the documentation above, contacting AWS Support might be the fastest way to resolve your issue; without seeing the job itself, it is difficult to provide more prescriptive guidance.

Hope this helps.

AWS
EXPERT
answered 2 years ago
