Glue job fails due to file encoding


Hi team,

I have a Glue job that reads a CSV file from S3 and loads it into a database.

I get the following error while running the job.

I think it's related to the file encoding.

The original file encoding is: **ISO-8859-1**

If I manually change the file encoding to UTF-8, the Glue job passes.

Does the CSV file need to be encoded in UTF-8 for the job to run successfully? Is there any way to work around this?

Thank you!

com.amazonaws.services.glue.util.FatalException: Unable to parse file: myFile.csv

	at com.amazonaws.services.glue.readers.JacksonReader.hasNextFailSafe(JacksonReader.scala:94)
	at com.amazonaws.services.glue.readers.JacksonReader.hasNext(JacksonReader.scala:38)
	at com.amazonaws.services.glue.readers.CSVReader.hasNext(CSVReader.scala:169)
	at com.amazonaws.services.glue.hadoop.TapeHadoopRecordReaderSplittable.nextKeyValue(TapeHadoopRecordReaderSplittable.scala:97)
	at org.apache.spark.rdd.NewHadoopRDD$$anon$1.hasNext(NewHadoopRDD.scala:247)
	at org.apache.spark.InterruptibleIterator.hasNext(InterruptibleIterator.scala:37)
	at
Jess
Asked 2 years ago · Viewed 2,116 times
1 Answer

Hi - You need to convert the character encoding from ISO-8859-1 to UTF-8 before letting AWS Glue process the file.

https://docs.aws.amazon.com/glue/latest/dg/components-key-concepts.html

Text-based data, such as CSVs, must be encoded in UTF-8 for AWS Glue to process it successfully.

There are a few examples listed here - https://github.com/aws-samples/aws-glue-samples/blob/master/examples/converting_char_encoding.md - which use Spark to convert the character encoding.
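As a minimal illustration of the re-encoding step itself (separate from the Spark-based samples linked above), here is a pure-Python sketch; the function name and sample bytes are illustrative, not part of any AWS API:

```python
def convert_to_utf8(data: bytes, source_encoding: str = "ISO-8859-1") -> bytes:
    """Decode bytes from the source encoding and re-encode them as UTF-8."""
    return data.decode(source_encoding).encode("utf-8")

# 'é' is the single byte 0xE9 in ISO-8859-1, but two bytes (0xC3 0xA9) in UTF-8,
# which is why a byte-for-byte copy of the file is not enough.
latin1_bytes = "café".encode("ISO-8859-1")
utf8_bytes = convert_to_utf8(latin1_bytes)
```

Inside a Glue job you can apply the same idea at scale with Spark's native CSV reader, which accepts an `encoding` option (e.g. `spark.read.option("encoding", "ISO-8859-1").csv(...)`); writing the resulting DataFrame back out produces UTF-8, which Glue can then process.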

Gokul (AWS Expert)
answered 2 years ago
reviewed 2 years ago by an AWS Expert
