EMR - Hive 3.1.3 with external metastore on RDS Aurora MySQL 8


Hi Team, we are trying to set up Hive with an external metastore running on Aurora MySQL 8. We are using EMR 6.15.0 and followed the instructions from the AWS documentation.
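For context, the AWS-documented setup points EMR at the external metastore through the `hive-site` configuration classification. A minimal sketch of that configuration (the hostname, database name, user, and password below are placeholders, not values from the original post; note that EMR ships a MariaDB connector as the default driver, which becomes relevant below):

```json
[
  {
    "Classification": "hive-site",
    "Properties": {
      "javax.jdo.option.ConnectionURL": "jdbc:mysql://hostname:3306/hive?createDatabaseIfNotExist=true",
      "javax.jdo.option.ConnectionDriverName": "org.mariadb.jdbc.Driver",
      "javax.jdo.option.ConnectionUserName": "username",
      "javax.jdo.option.ConnectionPassword": "password"
    }
  }
]
```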

We were able to successfully initialize the schema in RDS via the Hive schema tool, but once we started creating tables using Hive/PySpark on EMR, we observed the error below.
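For reference, the schema initialization step mentioned above is typically done with Hive's `schematool` on the EMR primary node; a sketch, assuming the metastore connection properties are already configured in `hive-site.xml`:

```
# Initialize the Hive metastore schema in the external MySQL-compatible RDS database
/usr/lib/hive/bin/schematool -dbType mysql -initSchema

# Verify the schema version afterwards
/usr/lib/hive/bin/schematool -dbType mysql -info
```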

```
pyspark.errors.exceptions.captured.AnalysisException: org.apache.hadoop.hive.ql.metadata.HiveException:
MetaException(message:javax.jdo.JDOUserException: Table "partition_keys" has been specified with a
primary-key to include column "TBL_ID" but this column is not found in the table. Please check your
<primary-key> column specification.
NestedThrowables: org.datanucleus.exceptions.NucleusUserException: Table "partition_keys" has been
specified with a primary-key to include column "TBL_ID" but this column is not found in the table.
Please check your <primary-key> column specification.)
```

This is a big blocker for our migration. The same setup works for us with Postgres as the external metastore, but we have to use MySQL because we need that RDS instance for other services such as Hue and Oozie as well.

Please help us with this.

Thanks Prabakaran

Asked 3 months ago · 206 views
1 answer

Found the problem: the MariaDB connector that ships with the EMR release version was the root cause of the issue above. We switched to the https://github.com/awslabs/aws-mysql-jdbc driver and the issue is resolved. Hope this helps someone.
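A hedged sketch of what that driver swap can look like: after placing the aws-mysql-jdbc jar on the Hive classpath (e.g. under /usr/lib/hive/lib/ on the EMR nodes), the metastore is pointed at the AWS driver class and its `jdbc:mysql:aws://` URL scheme instead of the bundled MariaDB connector. The hostname and database name below are placeholders:

```json
[
  {
    "Classification": "hive-site",
    "Properties": {
      "javax.jdo.option.ConnectionDriverName": "software.aws.rds.jdbc.mysql.Driver",
      "javax.jdo.option.ConnectionURL": "jdbc:mysql:aws://hostname:3306/hive"
    }
  }
]
```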

Thanks Prabakaran

Answered 3 months ago
