GENERIC_INTERNAL_ERROR: SerDeException thrown initializing deserializer org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe. Cause: org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe: columns has 110 elements while columns.types has 106 elements!


I get this error every time I query the 'airports' table in my database. The table was created by a Glue crawler I set up for this purpose, along with 3 other datasets; all of them were CSV files beforehand. I have already performed each and every step of this process locally in MySQL Workbench before trying it in AWS, and I never got an error like this when querying my airports dataset there. The Query Id this time was 5e5d2429-7332-466a-abc7-3e920cfe9bda. All I did was run this query:

SELECT * FROM "runway_db_athena"."airports" limit 10;

And again, I got:

GENERIC_INTERNAL_ERROR: SerDeException thrown initializing deserializer org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe. Cause: org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe: columns has 110 elements while columns.types has 106 elements! This query ran against the "runway_db_athena" database, unless qualified by the query. Please post the error message on our forum or contact customer support with Query Id: 5e5d2429-7332-466a-abc7-3e920cfe9bda

What does this mean and how do I fix it?

Asked 1 year ago · Viewed 481 times

1 Answer

That normally means you have a CSV with a header row containing more fields than the columns the crawler identified. This is usually caused by a CSV parsing issue, such as an unquoted comma inside a field name, so that splitting the line on commas produces extra "columns".
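The effect described above is easy to reproduce with a plain CSV parser. This is a minimal Python sketch (the header string is invented for illustration, not taken from the airports file) showing how a naive split on commas inflates the column count, while a quote-aware parser preserves the intended fields:

```python
import csv
import io

# A header row where one field name contains a comma inside quotes.
header = 'id,"city, state",elevation\n'

# Naive splitting on every comma (roughly what happens when quoting
# is not honored) yields one extra "column".
naive_columns = header.strip().split(",")
print(len(naive_columns))   # 4 -- the quoted field was split in two

# A CSV-aware parser honors the quotes and keeps the field intact.
parsed_columns = next(csv.reader(io.StringIO(header)))
print(len(parsed_columns))  # 3 -- the intended column count
```

The same mismatch between a naive split and the real field boundaries is what makes `columns` and `columns.types` disagree in the SerDe error.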

AWS
EXPERT
Answered 1 year ago
  • Interesting. How would one get around this so that queries can be run on this table regardless? All 3 of the other CSV files in the folder crawled by Glue came out as perfectly normal tables, and the task now is to join them all together.

  • I just created a new database in MySQL Workbench, used its Table Data Import Wizard to upload this same CSV into that new database, then ran DESC and SELECT * against it, and no such problems occurred, just as I suspected. So I think something else may be going on here.
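If a stray comma in the airports header is indeed the culprit, one common workaround is to replace the crawler-generated table definition (which uses LazySimpleSerDe) with one that uses OpenCSVSerDe, which honors quoted fields. A sketch, where the column list and S3 path are placeholders to be replaced with the real 110-column schema and bucket location:

```sql
-- Drop the crawler-generated table definition (the data in S3 is untouched).
DROP TABLE IF EXISTS runway_db_athena.airports;

-- Recreate it with OpenCSVSerDe so that a comma inside a quoted field
-- such as "city, state" is no longer split into two columns.
-- Column names and the LOCATION below are hypothetical placeholders.
CREATE EXTERNAL TABLE runway_db_athena.airports (
  id STRING,
  name STRING
  -- ... remaining columns ...
)
ROW FORMAT SERDE 'org.apache.hadoop.hive.serde2.OpenCSVSerde'
WITH SERDEPROPERTIES (
  'separatorChar' = ',',
  'quoteChar'     = '"',
  'escapeChar'    = '\\'
)
LOCATION 's3://your-bucket/path/to/airports/'
TBLPROPERTIES ('skip.header.line.count' = '1');
```

Note that OpenCSVSerDe reads every column as STRING, so numeric columns would need CAST in queries or a downstream view.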
