Spark with Scala 2.12.x on EMR


Is there a way to configure our EMR cluster to use Scala 2.12.x instead of Scala 2.11.x with Spark 2.4.x?

Edited by: razou on May 24, 2019 9:20 AM

razou
Asked 5 years ago · 1357 views
2 answers

Hi,

On EMR, Spark is built with Scala 2.11.x, which is currently the stable version. As per https://spark.apache.org/releases/spark-release-2-4-0.html , Scala 2.12 support in Spark 2.4.x is still experimental.
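If you want to confirm which Scala version the Spark installation on your cluster was built against, a quick check (a sketch, assuming you can open spark-shell on the master node) is to print the Scala version of the shell itself, since spark-shell runs on the same Scala build as Spark:

    // Inside spark-shell on the EMR master node:
    scala> scala.util.Properties.versionString
    // On a stock EMR release with Spark 2.4.x this reports a 2.11.x version,
    // e.g. "version 2.11.12"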
Our service team is already aware of this feature request, and they plan to add Scala 2.12 support in a coming release once it becomes stable. We don't currently have an ETA for this. AWS product teams regularly review feature requests and bug reports, but they have the final say on when to implement or fix them according to their own R&D schedules. Any new changes to EMR can be tracked at https://docs.aws.amazon.com/emr/latest/ReleaseGuide/emr-whatsnew.html .

WORKAROUND:
As a workaround, you can build your own Spark (with Scala 2.12) and install it on the EMR cluster as a custom application. EMR uses Apache Bigtop libraries to install applications (such as Spark, Hive, etc.). Please refer to https://aws.amazon.com/blogs/big-data/building-and-deploying-custom-applications-with-apache-bigtop-and-amazon-emr/ for deploying custom applications on an EMR cluster.
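If you take that route, your application build needs to target Scala 2.12 as well, with the Spark dependencies marked "provided" so the custom Spark you installed on the cluster supplies them at runtime. A minimal build.sbt sketch (project name and versions are placeholders; match the Spark version to the distribution you actually build and install):

    // Hypothetical build.sbt for an application targeting a custom Scala 2.12 Spark build
    name := "my-spark-app"              // placeholder project name
    scalaVersion := "2.12.8"            // Scala 2.12 line

    // Spark 2.4.x publishes experimental Scala 2.12 artifacts for compilation;
    // "provided" keeps them out of the assembly so the cluster's Spark is used at runtime.
    libraryDependencies ++= Seq(
      "org.apache.spark" %% "spark-core" % "2.4.0" % "provided",
      "org.apache.spark" %% "spark-sql"  % "2.4.0" % "provided"
    )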

Hope this helps. Please let us know if you face any further issues; we would be happy to help.

AWS
Answered 5 years ago

Thanks for your answer, cool

razou
Answered 5 years ago
