I have data in DynamoDB and use an AWS Glue job to ETL it to S3.
What I did:
- I added 6 tables to the job and then ran this query in Athena for each table:
select * from t;
All of them worked fine.
- Then I added a 7th table and reran the queries, but now every table returns this error:
HIVE_BAD_DATA: Field _lastchangedat's type BINARY in parquet file s3://datawarehouse.glue.extract.output/run-AmazonS3_node1647877440125-16-part-block-0-r-00003-snappy.parquet is incompatible with type bigint defined in table schema
Yes, each table has the field _lastchangedat, and it is mapped to long in the output table.
My questions: how can I find the cause of this error?
And how can I fix it?