1 Answer
You cannot have multiple SparkContexts in a single JVM; the corresponding Spark issue was resolved as Won't Fix. You have to stop the Spark session that spawned the SparkContext (which you have already done):
sparkR.session.stop()
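For example, the full stop-then-restart sequence in SparkR might look like the sketch below (the `master` and `appName` values are illustrative, not from the original answer):

```r
library(SparkR)

# Stop the current Spark session; this also tears down its SparkContext,
# freeing the JVM to host a new one.
sparkR.session.stop()

# Only after the old session is stopped can a fresh session (and hence
# a fresh SparkContext) be created in the same JVM.
sparkR.session(master = "local[*]", appName = "restarted-session")
```

Attempting `sparkR.session()` with different settings while the old session is still alive will not create a second context; stopping first is the only supported path.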
answered 4 years ago