Glue job fails due to file encoding


Hi team,

I have a Glue job that reads a CSV file from S3 and inserts it into a database.

I get the following error while running the job; I think it's related to the file encoding.

The original file encoding is **ISO-8859-1**.

If I manually change the file encoding to UTF-8, the Glue job passes.

Does the CSV file have to be encoded in UTF-8 for the job to run successfully? Is there any way to work around this?

Thank you!

com.amazonaws.services.glue.util.FatalException: Unable to parse file: myFile.csv

	at com.amazonaws.services.glue.readers.JacksonReader.hasNextFailSafe(JacksonReader.scala:94)
	at com.amazonaws.services.glue.readers.JacksonReader.hasNext(JacksonReader.scala:38)
	at com.amazonaws.services.glue.readers.CSVReader.hasNext(CSVReader.scala:169)
	at com.amazonaws.services.glue.hadoop.TapeHadoopRecordReaderSplittable.nextKeyValue(TapeHadoopRecordReaderSplittable.scala:97)
	at org.apache.spark.rdd.NewHadoopRDD$$anon$1.hasNext(NewHadoopRDD.scala:247)
	at org.apache.spark.InterruptibleIterator.hasNext(InterruptibleIterator.scala:37)
	at
Jess
asked 2 years ago
1 Answer

Hi - You need to convert the character encoding from ISO-8859-1 to UTF-8 before letting AWS Glue process the file.

https://docs.aws.amazon.com/glue/latest/dg/components-key-concepts.html

Text-based data, such as CSVs, must be encoded in UTF-8 for AWS Glue to process it successfully.

There are a few examples listed at https://github.com/aws-samples/aws-glue-samples/blob/master/examples/converting_char_encoding.md that use Spark to convert the character encoding.
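The linked samples convert the encoding with Spark; for a small file, the same idea can be sketched in plain Python. This is a minimal, hypothetical example that assumes the file is available locally (in practice you would download from and upload back to S3, e.g. with boto3); the sample data and the output name `myFile_utf8.csv` are illustrative only:

```python
# Create a sample ISO-8859-1 encoded CSV to stand in for the S3 file
# (hypothetical data with accented characters that break UTF-8 parsing).
sample = "name,city\nJosé,Zürich\n"
with open("myFile.csv", "w", encoding="iso-8859-1") as f:
    f.write(sample)

# Re-encode ISO-8859-1 -> UTF-8 line by line so Glue can parse the result.
with open("myFile.csv", "r", encoding="iso-8859-1") as src, \
     open("myFile_utf8.csv", "w", encoding="utf-8") as dst:
    for line in src:
        dst.write(line)
```

After the conversion, point the Glue job at the UTF-8 copy instead of the original object.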

AWS
EXPERT
Gokul
answered 2 years ago
reviewed 2 years ago
