EMR - Hive 3.1.3 with external metastore on RDS Aurora MySQL8


Hi Team, we are trying to set up Hive with an external metastore running on Aurora MySQL 8. We are using EMR 6.15.0 and followed the instructions in the AWS documentation.
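For context, our hive-site classification follows the properties from that documentation, roughly like the sketch below (the endpoint, database name, and credentials are placeholders):

```python
# Sketch of the EMR "hive-site" classification for an external metastore,
# e.g. passed as the Configurations parameter to boto3 emr run_job_flow().
# Endpoint, database name, user, and password are placeholders.
hive_site_classification = [
    {
        "Classification": "hive-site",
        "Properties": {
            # Aurora MySQL 8 endpoint and metastore database (placeholders)
            "javax.jdo.option.ConnectionURL": (
                "jdbc:mysql://aurora-cluster.example.us-east-1.rds.amazonaws.com:3306/"
                "hive?createDatabaseIfNotExist=true"
            ),
            # The default EMR setup resolves this to the bundled MariaDB driver
            "javax.jdo.option.ConnectionDriverName": "org.mariadb.jdbc.Driver",
            "javax.jdo.option.ConnectionUserName": "hive_user",
            "javax.jdo.option.ConnectionPassword": "hive_password",
        },
    }
]
```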

We were able to initialize the schema in RDS with the Hive schema tool, but once we started creating tables with Hive/PySpark on EMR, we observed the error below.

pyspark.errors.exceptions.captured.AnalysisException: org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:javax.jdo.JDOUserException: Table "partition_keys" has been specified with a primary-key to include column "TBL_ID" but this column is not found in the table. Please check your <primary-key> column specification. NestedThrowables: org.datanucleus.exceptions.NucleusUserException: Table "partition_keys" has been specified with a primary-key to include column "TBL_ID" but this column is not found in the table. Please check your <primary-key> column specification.)
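A minimal PySpark sketch of the kind of DDL that triggers it (database and table names are just examples):

```python
from pyspark.sql import SparkSession

# Minimal reproduction sketch: any DDL that touches the metastore fails the
# same way. Database and table names here are just examples.
spark = (
    SparkSession.builder
    .appName("metastore-smoke-test")
    .enableHiveSupport()  # route catalog calls through the Hive metastore
    .getOrCreate()
)

spark.sql("CREATE DATABASE IF NOT EXISTS demo_db")
spark.sql(
    "CREATE TABLE IF NOT EXISTS demo_db.demo_table (id INT, name STRING) "
    "PARTITIONED BY (dt STRING) STORED AS PARQUET"
)
```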

This is a big blocker for our migration. The same setup works for us with Postgres as the external metastore, but we have to use MySQL because the same RDS instance also needs to serve other services such as Hue and Oozie.

Please help us with this.

Thanks, Prabakaran

Asked 3 months ago, viewed 207 times
1 Answer

Found the problem: the MariaDB connector that ships with the EMR release was the root cause of the issue above. We switched to the https://github.com/awslabs/aws-mysql-jdbc driver and the issue is resolved. Hope this helps someone.
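Roughly, the change comes down to putting the aws-mysql-jdbc jar on the Hive classpath and pointing the hive-site driver class at it; a sketch with placeholder endpoint and credentials (the driver class name and the jdbc:mysql:aws:// URL prefix are taken from that project's README):

```python
# Sketch of the updated hive-site properties after switching drivers.
# Endpoint and credentials are placeholders. The jar itself still has to be
# on the Hive classpath (for example under /usr/lib/hive/lib/ on the cluster).
hive_site_properties = {
    "javax.jdo.option.ConnectionURL": (
        "jdbc:mysql:aws://aurora-cluster.example.us-east-1.rds.amazonaws.com:3306/"
        "hive?createDatabaseIfNotExist=true"
    ),
    # Driver class provided by the awslabs/aws-mysql-jdbc project
    "javax.jdo.option.ConnectionDriverName": "software.aws.rds.jdbc.mysql.Driver",
    "javax.jdo.option.ConnectionUserName": "hive_user",
    "javax.jdo.option.ConnectionPassword": "hive_password",
}
```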

Thanks, Prabakaran

Answered 3 months ago
