Glue job fails due to file encoding


Hi team,

I have a Glue job that reads a CSV file from S3 and loads it into a database.

I get the following error while running the job.

I think it's related to the file encoding.

The original file encoding is **ISO-8859-1**.

If I manually change the file encoding to UTF-8, the Glue job passes.

Do I need the CSV file to be encoded in UTF-8 for the job to run successfully? Is there any way to work around this?

Thank you!

com.amazonaws.services.glue.util.FatalException: Unable to parse file: myFile.csv

	at com.amazonaws.services.glue.readers.JacksonReader.hasNextFailSafe(JacksonReader.scala:94)
	at com.amazonaws.services.glue.readers.JacksonReader.hasNext(JacksonReader.scala:38)
	at com.amazonaws.services.glue.readers.CSVReader.hasNext(CSVReader.scala:169)
	at com.amazonaws.services.glue.hadoop.TapeHadoopRecordReaderSplittable.nextKeyValue(TapeHadoopRecordReaderSplittable.scala:97)
	at org.apache.spark.rdd.NewHadoopRDD$$anon$1.hasNext(NewHadoopRDD.scala:247)
	at org.apache.spark.InterruptibleIterator.hasNext(InterruptibleIterator.scala:37)
	at
Jess
Asked 2 years ago · 2116 views
1 Answer

Hi - You need to convert the character encoding from ISO-8859-1 to UTF-8 before letting AWS Glue process it.

https://docs.aws.amazon.com/glue/latest/dg/components-key-concepts.html

Text-based data, such as CSVs, must be encoded in UTF-8 for AWS Glue to process it successfully.

There are a few examples listed here: https://github.com/aws-samples/aws-glue-samples/blob/master/examples/converting_char_encoding.md, which use Spark to convert the character encoding.
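To illustrate the same idea, here is a minimal PySpark sketch (assuming a PySpark Glue job; the bucket names and paths are placeholders, not taken from the question) that reads the ISO-8859-1 CSV with an explicit encoding and rewrites it as UTF-8 before the rest of the job processes it:

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("convert-char-encoding").getOrCreate()

# Read the CSV with its original Latin-1 encoding; Spark's CSV reader
# accepts an explicit "encoding" option.
df = (spark.read
      .option("header", "true")
      .option("encoding", "ISO-8859-1")
      .csv("s3://my-bucket/input/myFile.csv"))   # placeholder path

# Spark writes CSV output as UTF-8 by default, so this produces a UTF-8
# copy that Glue can parse downstream.
(df.write
   .option("header", "true")
   .mode("overwrite")
   .csv("s3://my-bucket/converted/"))            # placeholder path
```

You could then point the rest of the Glue job (or a crawler) at the converted location instead of the original file.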

Gokul (AWS Expert)
Answered 2 years ago
Reviewed 2 years ago by an AWS Expert
