Why does my kernel keep dying when I try to import Hugging Face BERT models in Amazon SageMaker?


When I try to import Hugging Face BERT models into the conda_pytorch_p36 kernel of my Amazon SageMaker Notebook instance using the following pip command, the kernel always dies:

! pip install transformers

The result is the same for Hugging Face BERT, RoBERTa, and GPT2 models on ml.c5.2xlarge and ml.c5d.4xlarge Amazon SageMaker instances.

Why is this happening, and how do I resolve the issue?

AWS
EXPERT
Asked 4 years ago · 1,789 views
2 Answers
Accepted Answer

This issue occurs because the latest sentencepiece release is broken. The workaround is to force-install sentencepiece==0.1.91:

pip install sentencepiece==0.1.91
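
For example, a minimal notebook sketch of the workaround might look like the following (the bert-base-uncased model name is only an illustrative placeholder; restart the kernel after the installs so the pinned version is picked up):

! pip install sentencepiece==0.1.91
! pip install transformers

# After restarting the kernel, the import that previously killed it should work
from transformers import BertTokenizer, BertModel

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertModel.from_pretrained("bert-base-uncased")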

AWS
EXPERT
Answered 4 years ago

I am trying to download the "GPT-J-6B" model on SageMaker and the kernel keeps dying again and again. I have explained my issue in this question.

And installing sentencepiece==0.1.91 gives an error. I have also checked on PyPI that the syntax and version are correct, but pip install returns a subprocess-exited-with-error.

Any update on this?

(Please see the link given above for my reproducible code.) Thanks.

EM_User
Answered 1 year ago


