EMR - Hive 3.1.3 with external metastore on RDS Aurora MySQL 8


Hi Team, we are trying to set up Hive with an external metastore running in Aurora MySQL 8. We are using EMR 6.15.0 and followed the instructions from the AWS documentation.
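For reference, the external-metastore setup in that documentation boils down to a hive-site classification; a sketch of it, expressed as the Python structure you could pass to boto3's run_job_flow as Configurations (the endpoint, database name, user, and password below are placeholders, not our real values):

```python
# Sketch of the hive-site classification from the AWS external-metastore guide,
# written as a Python structure suitable for boto3 run_job_flow(Configurations=...).
# The Aurora endpoint, database name, user, and password are placeholders.
hive_metastore_config = [
    {
        "Classification": "hive-site",
        "Properties": {
            # The default EMR setup reaches MySQL through the bundled MariaDB connector.
            "javax.jdo.option.ConnectionDriverName": "org.mariadb.jdbc.Driver",
            "javax.jdo.option.ConnectionURL": (
                "jdbc:mysql://aurora-cluster-endpoint:3306/hive"
                "?createDatabaseIfNotExist=true"
            ),
            "javax.jdo.option.ConnectionUserName": "hive_user",
            "javax.jdo.option.ConnectionPassword": "hive_password",
        },
    }
]
```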

We were able to initialize the schema in RDS via the Hive schema tool, but once we start creating tables using Hive/PySpark in EMR, we see the error below.

pyspark.errors.exceptions.captured.AnalysisException: org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:javax.jdo.JDOUserException: Table "partition_keys" has been specified with a primary-key to include column "TBL_ID" but this column is not found in the table. Please check your <primary-key> column specification. NestedThrowables: org.datanucleus.exceptions.NucleusUserException: Table "partition_keys" has been specified with a primary-key to include column "TBL_ID" but this column is not found in the table. Please check your <primary-key> column specification.)
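For context, any ordinary DDL that touches the metastore triggers the failure. A minimal PySpark sketch of that path, assuming hive-site.xml on the cluster already points at the Aurora MySQL 8 instance (the database and table names here are placeholders):

```python
# Minimal sketch of the failing path: a plain Hive table created through PySpark.
# Assumes hive-site.xml already points the metastore at the external Aurora MySQL 8
# database; demo_db / events are placeholder names.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("metastore-smoke-test")
    .enableHiveSupport()
    .getOrCreate()
)

spark.sql("CREATE DATABASE IF NOT EXISTS demo_db")
spark.sql("""
    CREATE TABLE IF NOT EXISTS demo_db.events (
        event_id STRING,
        event_ts TIMESTAMP
    )
    PARTITIONED BY (dt STRING)
    STORED AS PARQUET
""")
```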

This is a big blocker for our migration. The same setup works for us with Postgres as the external metastore, but we have to use MySQL because we need the same RDS instance for other services such as Hue and Oozie.

Please help us with this.

Thanks, Prabakaran

Asked 3 months ago · 208 views

1 Answer

Found the problem: the MariaDB connector that ships with the EMR release was the root cause of the above issue. We switched to the https://github.com/awslabs/aws-mysql-jdbc driver and the issue is resolved. Hope this helps someone.
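If it helps, a sketch of what the swap looks like in the hive-site properties. The driver class and the jdbc:mysql-aws:// URL prefix are taken from the aws-mysql-jdbc project's README, so verify them against the driver version you install; the driver JAR also has to be on the Hive/Spark classpath (for example under /usr/lib/hive/lib/ on the EMR primary node), and the endpoint and credentials below are placeholders:

```python
# Sketch of the hive-site classification after switching from the bundled MariaDB
# connector to the AWS JDBC Driver for MySQL (awslabs/aws-mysql-jdbc).
# Driver class and URL prefix come from that project's README; the Aurora endpoint,
# user, and password are placeholders. The driver JAR must also be present on the
# Hive/Spark classpath on the cluster nodes.
hive_metastore_config = [
    {
        "Classification": "hive-site",
        "Properties": {
            "javax.jdo.option.ConnectionDriverName": "software.aws.rds.jdbc.mysql.Driver",
            "javax.jdo.option.ConnectionURL": (
                "jdbc:mysql-aws://aurora-cluster-endpoint:3306/hive"
                "?createDatabaseIfNotExist=true"
            ),
            "javax.jdo.option.ConnectionUserName": "hive_user",
            "javax.jdo.option.ConnectionPassword": "hive_password",
        },
    }
]
```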

Thanks, Prabakaran

Answered 3 months ago
