Spark with Scala 2.12.x on EMR


Is there a way to configure our EMR cluster to use Scala 2.12.x instead of Scala 2.11.x with Spark 2.4.x?

Edited by: razou on May 24, 2019 9:20 AM

razou
Asked 5 years ago · 1357 views
2 Answers

Hi,

On EMR, Spark is built with Scala 2.11.x, which is currently the stable version. As noted in the Spark 2.4.0 release notes (https://spark.apache.org/releases/spark-release-2-4-0.html), Scala 2.12 support is still experimental.
Our service team is already aware of this feature request and plans to add Scala 2.12 support in a coming release once it becomes stable; we do not have an ETA for this currently. AWS product teams regularly review feature requests and bug reports, but they have the final say on when to implement or fix them according to their own R&D schedules. Any new changes to EMR can be tracked at https://docs.aws.amazon.com/emr/latest/ReleaseGuide/emr-whatsnew.html.
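In practice this means a Spark application submitted to an EMR cluster running Spark 2.4.x should be compiled against Scala 2.11. A minimal sbt sketch (the project name and exact patch versions below are assumptions; check the release notes for your EMR version to see what your cluster actually ships):

```scala
// build.sbt -- sketch of a Spark job built to match EMR's Scala-2.11 Spark 2.4.x
name := "my-spark-job"            // hypothetical project name
version := "0.1.0"
scalaVersion := "2.11.12"         // match the cluster's Scala line, not 2.12.x

libraryDependencies ++= Seq(
  // "provided": spark-submit on the cluster supplies the Spark jars at runtime
  "org.apache.spark" %% "spark-core" % "2.4.0" % "provided",
  "org.apache.spark" %% "spark-sql"  % "2.4.0" % "provided"
)
```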

WORKAROUND:
As a workaround, you may build your own Spark and install it on the EMR cluster as a custom application. EMR uses Apache Bigtop libraries to install applications (such as Spark, Hive, etc.). Please refer to https://aws.amazon.com/blogs/big-data/building-and-deploying-custom-applications-with-apache-bigtop-and-amazon-emr/ for deploying custom applications on an EMR cluster.
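If you do go the custom-build route and install a Scala 2.12 build of Spark 2.4.x on the cluster, your job's build definition would switch to the 2.12 line accordingly. A minimal sketch, assuming the custom 2.12-built Spark is what spark-submit picks up on the cluster (the versions shown are illustrative, not EMR defaults):

```scala
// build.sbt -- sketch for a job targeting a custom Scala-2.12 build of Spark 2.4.x
scalaVersion := "2.12.8"          // assumes your custom Spark was built for Scala 2.12

libraryDependencies ++= Seq(
  // still "provided": the custom Spark installed on the cluster supplies these jars
  "org.apache.spark" %% "spark-core" % "2.4.0" % "provided",
  "org.apache.spark" %% "spark-sql"  % "2.4.0" % "provided"
)
```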

Hope it helps. Please let us know if you face any further issues. We would be happy to help you.

AWS
Answered 5 years ago

Thanks for your answer, cool

razou
Answered 5 years ago
