SparkR not working


I am trying to control a Spark cluster (using SparkR) from a SageMaker notebook. I followed these instructions closely: https://aws.amazon.com/blogs/machine-learning/build-amazon-sagemaker-notebooks-backed-by-spark-in-amazon-emr/ and got it to work.

Today, when I try to run the SageMaker notebook (using the exact same code as before), I inexplicably get this error:

An error was encountered:
[1] "Error in callJMethod(sparkSession, \"read\"): Invalid jobj 1. If SparkR was restarted, Spark operations need to be re-executed."

Does anyone know why this is? I terminated the SparkR kernel and am still getting this error.

AWS
EXPERT
asked 4 years ago · 272 views
1 Answer
Accepted Answer

You cannot have multiple SparkContexts in one JVM; that limitation was resolved upstream as Won't Fix (see the JIRA link below). You have to stop the Spark session that spawned the SparkContext (which you have already done):

sparkR.session.stop()

https://issues.apache.org/jira/browse/SPARK-2243
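
A minimal recovery sketch, assuming the EMR-backed SparkR kernel from the blog post (the S3 path below is just a placeholder): stop the stale session, start a fresh one, and re-execute the reads, since SparkDataFrames created against the old session cannot be reused.

library(SparkR)

sparkR.session.stop()   # tear down the stale session and its SparkContext
sparkR.session()        # start a new session (creates a new SparkContext)

# Re-run any Spark operations against the new session; references created
# before the restart are no longer valid.
df <- read.df("s3://my-bucket/data.csv", "csv", header = "true")
head(df)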

answered 4 years ago
