Unable to create Spark context in JupyterHub


When creating the SparkContext, it keeps failing with a fatal error even though I have restarted the kernel multiple times. As I am a data analyst, I am not sure what else to check in terms of resources.

Sri
asked 7 months ago · 191 views
1 Answer
Accepted Answer

Hello,

Since you are getting a fatal error when creating the SparkContext, the issue is likely in acquiring cluster resources such as YARN memory or CPU. A few things to check:

• Check the current memory and CPU availability in CloudWatch and in the YARN ResourceManager UI/logs. If applications are already running and have taken over the cluster's resources, you may not be able to start a new application from the Jupyter notebook (a quick way to run this check from the notebook itself is sketched below).
• Check HDFS utilization and the MR healthy-node status in the CloudWatch metrics. If no nodes are available to take the containers, a new application may fail to initiate.
• Based on the resource checks above, add more nodes, enable auto scaling, or attach additional EBS volumes if you find a resource constraint on the EMR nodes.

Here is the EMR best practice guide for reference.
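As a quick way to run the resource checks above from the notebook itself, here is a minimal Python sketch that queries the YARN ResourceManager REST API for cluster headroom and for the applications currently holding resources. The ResourceManager address is a placeholder you would replace with your EMR master node's DNS name (8088 is the default RM web UI port):

```python
import requests

RM = "http://<emr-master-dns>:8088"  # placeholder: your ResourceManager address

# Cluster-wide headroom and node health.
metrics = requests.get(f"{RM}/ws/v1/cluster/metrics").json()["clusterMetrics"]
print(f"available memory : {metrics['availableMB']} MB")
print(f"available vcores : {metrics['availableVirtualCores']}")
print(f"running apps     : {metrics['appsRunning']}")
print(f"unhealthy nodes  : {metrics['unhealthyNodes']}")

# Applications currently running and the resources they hold.
apps = requests.get(f"{RM}/ws/v1/cluster/apps", params={"states": "RUNNING"}).json()
for app in (apps.get("apps") or {}).get("app", []):
    print(app["id"], app["name"],
          f"{app['allocatedMB']} MB", f"{app['allocatedVCores']} vcores")
```

If the headroom looks tight but not zero, one thing you can try before resizing the cluster is to ask YARN for a smaller footprint when creating the session. A sketch with illustrative values (tune them for your workload; they are not recommendations):

```python
from pyspark.sql import SparkSession

# Request a deliberately small YARN footprint so the session can start
# even on a busy cluster.
spark = (
    SparkSession.builder
    .appName("jupyterhub-smoke-test")
    .config("spark.executor.memory", "1g")
    .config("spark.executor.cores", "1")
    .config("spark.executor.instances", "1")
    .config("spark.dynamicAllocation.enabled", "false")
    .getOrCreate()
)
sc = spark.sparkContext
print(sc.applicationId)  # if this prints, YARN granted the containers
```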

AWS
SUPPORT ENGINEER
answered 7 months ago
  • Thanks a ton!!! This is helpful.
