1 Answer
This is just a warning and should not, by itself, stop data from being written. I'd suggest enabling driver logs, which will give you additional detail to investigate the issue and isolate it accordingly.
If you enable the Spark UI logs, you can open them in the history server to see what the driver was doing: check whether the driver started a Spark job/stage and never got resources, or whether it got resources but something failed further along.
Also, the problem sometimes depends on a particular stage in the job, and the Spark UI should help you find which stage it exists in.
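For an AWS Glue job, enabling these logs is done through the job's special parameters. A minimal sketch using the AWS CLI is below; the job name, role ARN, script location, and S3 paths are placeholders you'd replace with your own:

```shell
# Enable Spark UI event logging and continuous CloudWatch logs for a Glue job.
# "my-glue-job", the role ARN, and the s3:// paths below are hypothetical examples.
aws glue update-job --job-name my-glue-job \
  --job-update '{
    "Role": "arn:aws:iam::123456789012:role/MyGlueRole",
    "Command": {"Name": "glueetl", "ScriptLocation": "s3://my-bucket/scripts/job.py"},
    "DefaultArguments": {
      "--enable-spark-ui": "true",
      "--spark-event-logs-path": "s3://my-bucket/spark-event-logs/",
      "--enable-continuous-cloudwatch-log": "true"
    }
  }'
```

With `--enable-spark-ui` set, Glue writes Spark event logs to the given S3 path, which a Spark history server can then read.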
References:
- Glue Known Issues
- Glue parallel read jobs
Hope you find this useful.
I have access to the Spark logs, but it's a big 17,207-line JSON file. I downloaded the logs to my machine, installed Spark locally, and opened the Spark web UI.
I'm not sure how to view the downloaded logs in my local Spark web UI. Any pointers?
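Assuming the downloaded JSON file is a Spark event log, one way to view it locally is to point a standalone Spark history server at the directory containing it. A sketch, where `/tmp/spark-events` and the file name are examples and `SPARK_HOME` is assumed to point at the local Spark install:

```shell
# Put the downloaded event-log file into a directory the history server will scan.
# The path and file name here are hypothetical examples.
mkdir -p /tmp/spark-events
cp ~/Downloads/spark-application-logs.json /tmp/spark-events/

# Tell the history server where the event logs live, then start it.
export SPARK_HISTORY_OPTS="-Dspark.history.fs.logDirectory=file:/tmp/spark-events"
"$SPARK_HOME"/sbin/start-history-server.sh

# The history server UI is then available at http://localhost:18080
```

The regular `spark-shell` web UI only shows live applications; completed event logs need the history server (`sbin/start-history-server.sh`) instead.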