SparkR not working


I am trying to control a Spark cluster (using SparkR) from a SageMaker notebook. I followed these instructions closely: https://aws.amazon.com/blogs/machine-learning/build-amazon-sagemaker-notebooks-backed-by-spark-in-amazon-emr/ and got it to work.

Today, when I try to run the SageMaker notebook (using the exact same code as before), I inexplicably get this error:

An error was encountered:
[1] "Error in callJMethod(sparkSession, \"read\"): Invalid jobj 1. If SparkR was restarted, Spark operations need to be re-executed."

Does anyone know why this is happening? I terminated the SparkR kernel and am still getting this error.
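
For reference, a sketch of the kind of call that hits this error (the bucket path, format, and variable name are placeholders, not my actual notebook code):

# Placeholder sketch, not the exact notebook code:
df <- read.df("s3://example-bucket/example-data/", source = "parquet")
# ^ this now fails with "Invalid jobj 1": the R-side sparkSession handle
#   refers to a JVM object that no longer exists after the backend restarted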

Accepted Answer

You cannot have multiple SparkContexts in one JVM; the upstream issue tracking this (SPARK-2243, linked below) was resolved as Won't Fix. You have to stop the Spark session that spawned the SparkContext (which you have already done) before creating a new one:

sparkR.session.stop()

https://issues.apache.org/jira/browse/SPARK-2243
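
A minimal sketch of the full recovery, assuming a fresh session is acceptable; the data path and format below are placeholders rather than code from the original notebook:

sparkR.session.stop()   # drop the stale session whose JVM-side objects are gone
library(SparkR)         # make sure SparkR is attached in the restarted kernel
sparkR.session()        # start a new session; old jobj handles remain invalid
# Re-run the Spark operations from scratch, e.g. the read that previously failed
df <- read.df("s3://example-bucket/example-data/", source = "parquet")
head(df)

Note that any SparkDataFrames created before the restart wrap invalid jobj handles, so they have to be rebuilt from their sources as well, as the error message itself indicates.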

answered 4 years ago
