Extracting tables DynamoDB -> S3: HIVE_BAD_DATA


I have DynamoDB tables and run an ETL job to S3 in AWS Glue. What I did:

  1. I added 6 tables to the job and then, in Athena, executed a query for each table (select * from t;); they all worked fine.
  2. Then I added a 7th table and executed the queries again, but now every table returns this error: HIVE_BAD_DATA: Field _lastchangedat's type BINARY in parquet file s3://datawarehouse.glue.extract.output/run-AmazonS3_node1647877440125-16-part-block-0-r-00003-snappy.parquet is incompatible with type bigint defined in table schema

Yes, each table has the field _lastchangedat, and it is mapped to long in the output table.
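
For reference, the relevant part of the Glue job looks roughly like the sketch below. This is a simplified reconstruction, not the exact generated script: the DynamoDB table name, read options, and the source type in the mapping are placeholders/assumptions; only the _lastchangedat -> long mapping and the Parquet output to the bucket from the error message reflect my actual setup.

```python
import sys
from awsglue.context import GlueContext
from awsglue.job import Job
from awsglue.transforms import ApplyMapping
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glueContext = GlueContext(SparkContext())
job = Job(glueContext)
job.init(args["JOB_NAME"], args)

# Read one DynamoDB table (the table name is a placeholder).
source = glueContext.create_dynamic_frame.from_options(
    connection_type="dynamodb",
    connection_options={"dynamodb.input.tableName": "my_table"},
)

# Map the DynamoDB attribute to long (bigint) in the output schema.
# The source type "long" is an assumption; in practice it is whatever
# type Glue infers for the attribute.
mapped = ApplyMapping.apply(
    frame=source,
    mappings=[
        ("_lastchangedat", "long", "_lastchangedat", "long"),
        # ... other columns ...
    ],
)

# Write snappy Parquet to the bucket that appears in the error message.
glueContext.write_dynamic_frame.from_options(
    frame=mapped,
    connection_type="s3",
    connection_options={"path": "s3://datawarehouse.glue.extract.output/"},
    format="parquet",
)

job.commit()
```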

The question: how do I find the cause of this error, and how do I fix it?
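
Would inspecting the actual schema of the Parquet file named in the error be a reasonable way to narrow this down? Something like the following sketch (pyarrow + boto3; the bucket and key are taken from the error message, credentials and setup omitted):

```python
# Sketch: download the Parquet file named in the error and print its schema,
# to compare the physical type actually written for _lastchangedat with the
# bigint declared in the Glue/Athena table.
import boto3
import pyarrow.parquet as pq

bucket = "datawarehouse.glue.extract.output"
key = "run-AmazonS3_node1647877440125-16-part-block-0-r-00003-snappy.parquet"

boto3.client("s3").download_file(bucket, key, "/tmp/part.parquet")

pf = pq.ParquetFile("/tmp/part.parquet")
print(pf.schema)                                  # raw Parquet physical types
print(pf.schema_arrow.field("_lastchangedat"))    # the column as Arrow sees it
```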

Oleg
