SparkR not working


I am trying to control a Spark cluster (using SparkR) from a SageMaker notebook. I followed these instructions closely: https://aws.amazon.com/blogs/machine-learning/build-amazon-sagemaker-notebooks-backed-by-spark-in-amazon-emr/ and got it working.

Today, when I try to run the SageMaker notebook (using the exact same code as before), I inexplicably get this error:

An error was encountered:
[1] "Error in callJMethod(sparkSession, \"read\"): Invalid jobj 1. If SparkR was restarted, Spark operations need to be re-executed."

Does anyone know why this happens? I terminated the SparkR kernel and am still getting this error.

AWS
Expert
Asked 4 years ago · 272 views
1 Answer
Accepted Answer

You cannot have multiple SparkContexts in one JVM; the upstream issue (SPARK-2243, linked below) was closed as Won't Fix. You have to stop the Spark session that spawned the SparkContext (which you have already done):

sparkR.session.stop()

https://issues.apache.org/jira/browse/SPARK-2243
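The "Invalid jobj" error means the R-side references point at a JVM session that no longer exists, so stopping is only half the fix: you also have to start a new session and re-execute every Spark operation against it. A minimal sketch of the recovery sequence, assuming SparkR is installed and the EMR connection settings from the linked blog post are already configured (the S3 path and app name below are placeholders, not from the original post):

```r
library(SparkR)

# Discard the stale session; any jobj handles it created are now invalid
sparkR.session.stop()

# Start a fresh session (this creates a new SparkContext in the JVM)
sparkR.session(appName = "sagemaker-sparkr")

# Re-execute the reads: DataFrames from the old session cannot be reused
df <- read.df("s3://your-bucket/your-data.csv", source = "csv", header = "true")
head(df)
```

Any variable that still holds a DataFrame from the old session will keep raising the same error until it is rebuilt from a fresh `read.df` (or equivalent) call.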

Answered 4 years ago
