Failed ETL Job


Hi,

I'm trying to write data from an RDS Aurora database (source) to S3 (target) using an AWS Glue job and I'm getting this error:

py4j.protocol.Py4JJavaError: An error occurred while calling o107.pyWriteDynamicFrame.
: org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 1.0 failed 4 times, most recent failure: Lost task 0.3 in stage 1.0 (TID 4, ip-172-30-x-x.ec2.internal, executor 1): com.mysql.jdbc.exceptions.jdbc4.CommunicationsException: Communications link failure

The last packet successfully received from the server was 129,902 milliseconds ago. The last packet sent successfully to the server was 129,902 milliseconds ago.
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
...
Caused by: java.io.EOFException: Can not read response from server. Expected to read 4 bytes, read 0 bytes before connection was unexpectedly lost.

Any idea of what could be happening?
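(For context on the timeout angle: "last packet received ... 129,902 milliseconds ago" usually means the MySQL connection sat idle or blocked longer than some timeout allowed. One common first step is widening the JDBC driver's connect/socket timeouts on the source connection. Below is a minimal sketch of building such a URL; the host and database names are placeholders, and `connectTimeout`/`socketTimeout` are Connector/J-style URL properties:)

```python
# Sketch: a MySQL JDBC URL with explicit timeouts, the kind of value a
# Glue job's JDBC connection options would carry. Host/database names
# below are made up for illustration.

def jdbc_url(host, port, database,
             connect_timeout_ms=10_000, socket_timeout_ms=300_000):
    """Return a MySQL JDBC URL with explicit connect/socket timeouts."""
    return (
        f"jdbc:mysql://{host}:{port}/{database}"
        f"?connectTimeout={connect_timeout_ms}"
        f"&socketTimeout={socket_timeout_ms}"
    )

url = jdbc_url("my-cluster.cluster-xxxx.us-east-1.rds.amazonaws.com",
               3306, "mydb")
print(url)
```

In a Glue job this URL would typically feed the connection options of the JDBC source (e.g. via `glueContext.create_dynamic_frame.from_options`). Note that this error can also be a networking issue rather than a timeout: the Glue connection's security group generally needs a self-referencing inbound rule so the job's elastic network interfaces can reach the database.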

asked 5 years ago · 463 views
1 Answer

Hi, I'm getting the same error when trying to ETL from S3 to an RDS Aurora Serverless destination, even though the connection test in the Glue UI shows as successful.
Job run ID: jr_d538e6bd5b30686c1d9a5b12fea74e131ab64308ad3c7ee03991ec0ebf2885e8
Have you been able to resolve this issue?

answered 5 years ago
