Glue job fails due to file encoding


Hi team,

I have a Glue job that reads a CSV file from S3 and inserts it into a database.

I get the following error while running the job.

I think it's related to the file encoding.

The original file encoding is **ISO-8859-1**.

If I manually change the file encoding to UTF-8, the Glue job passes.

Do I need the CSV file to be encoded in UTF-8 for the job to run successfully? Is there any way to work around this?

Thank you!

com.amazonaws.services.glue.util.FatalException: Unable to parse file: myFile.csv

	at com.amazonaws.services.glue.readers.JacksonReader.hasNextFailSafe(JacksonReader.scala:94)
	at com.amazonaws.services.glue.readers.JacksonReader.hasNext(JacksonReader.scala:38)
	at com.amazonaws.services.glue.readers.CSVReader.hasNext(CSVReader.scala:169)
	at com.amazonaws.services.glue.hadoop.TapeHadoopRecordReaderSplittable.nextKeyValue(TapeHadoopRecordReaderSplittable.scala:97)
	at org.apache.spark.rdd.NewHadoopRDD$$anon$1.hasNext(NewHadoopRDD.scala:247)
	at org.apache.spark.InterruptibleIterator.hasNext(InterruptibleIterator.scala:37)
	at
Jess
asked 2 years ago · 2116 views
1 Answer

Hi - You need to convert the character encoding from ISO-8859-1 to UTF-8 before letting AWS Glue process it.

https://docs.aws.amazon.com/glue/latest/dg/components-key-concepts.html

Text-based data, such as CSVs, must be encoded in UTF-8 for AWS Glue to process it successfully.

There are a few examples listed here: https://github.com/aws-samples/aws-glue-samples/blob/master/examples/converting_char_encoding.md, which use Spark to convert the character encoding.
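For reference, here is a minimal PySpark sketch of that approach (not the exact script from the linked samples): it reads the CSV with an explicit ISO-8859-1 encoding option and writes it back out as UTF-8 so Glue can process it. The bucket and key paths are placeholders.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("csv-encoding-conversion").getOrCreate()

# Read the source CSV, telling Spark it is ISO-8859-1 encoded
# (paths below are placeholders for your own bucket/keys).
df = (
    spark.read
    .option("header", "true")
    .option("encoding", "ISO-8859-1")
    .csv("s3://my-bucket/input/myFile.csv")
)

# Write it back out as UTF-8 so the Glue job can parse it.
(
    df.write
    .mode("overwrite")
    .option("header", "true")
    .option("encoding", "UTF-8")
    .csv("s3://my-bucket/converted/")
)
```

You could run this as a small pre-processing Glue/Spark job (or any Spark environment with S3 access) before the main job that loads the data into the database.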

AWS EXPERT
Gokul
answered 2 years ago
Reviewed by an AWS EXPERT 2 years ago
