Spark with Scala 2.12.x on EMR

0

Is there a way to configure our EMR cluster to use Scala 2.12.x instead of Scala 2.11.x with Spark 2.4.x?

Edited by: razou on May 24, 2019 9:20 AM

razou
asked 5 years ago · 1344 views
2 Answers
0

Hi,

On EMR, Spark is built with Scala 2.11.x, which is currently the stable version. As noted in the Spark 2.4.0 release notes (https://spark.apache.org/releases/spark-release-2-4-0.html), Scala 2.12 support is still experimental.
Our service team is already aware of this feature request and plans to add Scala 2.12 support in a coming release, once it becomes stable. We don't have an ETA for this currently. AWS product teams regularly review feature requests and bug reports, but they have the final say on when to implement or fix them according to their own R&D schedules. New changes for EMR can be tracked at https://docs.aws.amazon.com/emr/latest/ReleaseGuide/emr-whatsnew.html .
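If you want to confirm which Scala version the Spark build on your cluster was compiled against, you can check it from spark-shell. This is only a minimal sketch; the values shown in the comments are what you would expect on a stock EMR Spark 2.4.x install (an assumption, so verify on your own cluster):

    // Run inside spark-shell on the cluster's master node.
    // Prints the Scala version the running Spark build was compiled against,
    // and the Spark version itself.
    println(scala.util.Properties.versionString)  // e.g. "version 2.11.12" on stock EMR Spark 2.4.x
    println(spark.version)                        // e.g. "2.4.0"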

WORKAROUND:
As a workaround, you may build your own Spark and install it on the EMR cluster as an application. EMR uses Apache Bigtop to install applications (such as Spark, Hive, etc.). Please refer to https://aws.amazon.com/blogs/big-data/building-and-deploying-custom-applications-with-apache-bigtop-and-amazon-emr/ for deploying custom applications on an EMR cluster.
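If you go down that custom-build route, your application jar also has to be compiled against the same Scala version as the Spark you install. The build.sbt below is only an illustrative sketch; the project name and the 2.12.8 / 2.4.3 version numbers are assumptions, not EMR defaults:

    // build.sbt -- illustrative sketch only; versions are assumptions, not EMR defaults.
    // The application must be compiled against the same Scala version as the Spark
    // build installed on the cluster (here: a custom Spark 2.4.x built with Scala 2.12).
    name := "my-emr-app"        // hypothetical project name

    scalaVersion := "2.12.8"    // must match the Scala version of your custom Spark build

    libraryDependencies ++= Seq(
      // "provided" so spark-submit uses the Spark jars already installed on the cluster
      "org.apache.spark" %% "spark-core" % "2.4.3" % "provided",
      "org.apache.spark" %% "spark-sql"  % "2.4.3" % "provided"
    )

You would then package the jar (for example with sbt package or an assembly plugin) and submit it with spark-submit against the custom Spark installed on the cluster.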

Hope this helps. Please let us know if you face any further issues; we would be happy to help.

AWS
answered 5 years ago
0

Thanks for your answer, cool

razou
answered 5 years ago
