How to increase the storage of a hosting instance


The parameters of a large neural network model can be huge, but the maximum storage size of a hosting instance is only 30 GB, according to https://docs.aws.amazon.com/sagemaker/latest/dg/host-instance-storage.html. Is there a way to increase the storage volume? I have a model (embeddings) that is very close to 30 GB, and it causes a no-space error when deploying.

Thanks!

bill10
asked 4 years ago · 1,753 views
2 Answers
Accepted Answer

The disk size is currently not configurable for SageMaker endpoints backed by EBS volumes. As a workaround, please use an instance type with ephemeral (local NVMe) storage for your SageMaker endpoint.

Example instance types with ephemeral storage: families whose name carries a "d" suffix, such as ml.m5d, ml.c5d, and ml.g4dn.

The full list of Amazon SageMaker instance types can be accessed here: https://aws.amazon.com/sagemaker/pricing/instance-types/
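For reference, a minimal deployment sketch with the SageMaker Python SDK; the image URI, model artifact location, role ARN, and instance size below are placeholders to replace with your own values.

import sagemaker
from sagemaker.model import Model

# Placeholder values -- substitute your own inference image, model artifact, and role.
model = Model(
    image_uri="<your-inference-image-uri>",
    model_data="s3://<your-bucket>/model.tar.gz",
    role="<your-sagemaker-execution-role-arn>",
    sagemaker_session=sagemaker.Session(),
)

# Deploy to an instance type that provides local NVMe (ephemeral) storage,
# so the large model artifact is not constrained by the EBS volume size.
predictor = model.deploy(
    initial_instance_count=1,
    instance_type="ml.m5d.4xlarge",
)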

dlragha
answered 4 years ago

Thanks! Using an x5d instance solves the issue.

And a quick note: even though I could now download the big model to the endpoint, I got a timeout error when loading the model in the endpoint. After some trial and error, I solved it by increasing the timeout value and reducing the number of workers via environment variables. To do this, pass this dict

env={"SAGEMAKER_MODEL_SERVER_WORKERS":"1",
"SAGEMAKER_MODEL_SERVER_TIMEOUT":"1800"}

when creating the model object.
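For example, with the generic Model class from the SageMaker Python SDK (image URI, model artifact path, role, and instance type are again placeholders):

from sagemaker.model import Model

model = Model(
    image_uri="<your-inference-image-uri>",        # placeholder
    model_data="s3://<your-bucket>/model.tar.gz",  # placeholder
    role="<your-sagemaker-execution-role-arn>",    # placeholder
    env={
        "SAGEMAKER_MODEL_SERVER_WORKERS": "1",     # single worker, so the large model is loaded only once
        "SAGEMAKER_MODEL_SERVER_TIMEOUT": "1800",  # allow up to 30 minutes (in seconds) for loading
    },
)

predictor = model.deploy(initial_instance_count=1, instance_type="ml.m5d.4xlarge")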

bill10
answered 4 years ago
