How can I run SageMaker Serverless Inference on a GPU instance?


I want to run an ML model with SageMaker Serverless Inference on a GPU instance, but there is no option to select the instance type. Is it possible to run on a GPU instance?

Salman
Asked 5 months ago · 1,096 views
1 Answer

Unfortunately, GPU-based inference isn't currently supported on SageMaker Serverless Inference. From the feature exclusions section of the serverless endpoints documentation:

Some of the features currently available for SageMaker Real-time Inference are not supported for Serverless Inference, including GPUs, AWS marketplace model packages, private Docker registries, Multi-Model Endpoints, VPC configuration, network isolation, data capture, multiple production variants, Model Monitor, and inference pipelines.

Link here: https://docs.aws.amazon.com/sagemaker/latest/dg/serverless-endpoints.html
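This limitation is also visible in the SageMaker Python SDK: a serverless endpoint config only exposes memory size and max concurrency, so there is nowhere to request an instance type (and hence no GPU). Below is a minimal sketch; the image URI, model artifact path, and role ARN are placeholders you would replace with your own values.

```python
from sagemaker.model import Model
from sagemaker.serverless import ServerlessInferenceConfig

# Placeholder model definition -- substitute your own container, artifact, and role.
model = Model(
    image_uri="<your-inference-image-uri>",
    model_data="s3://<your-bucket>/model.tar.gz",
    role="<your-sagemaker-execution-role-arn>",
)

# The only tunables for serverless inference are memory (1024-6144 MB, in 1 GB
# increments) and the maximum number of concurrent invocations -- there is no
# instance_type parameter, so no GPU can be selected.
serverless_config = ServerlessInferenceConfig(
    memory_size_in_mb=4096,
    max_concurrency=10,
)

predictor = model.deploy(serverless_inference_config=serverless_config)
```

If you need GPU acceleration today, you would have to deploy to a real-time endpoint (where `instance_type` accepts GPU instances such as the g4dn family) rather than a serverless one.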

AWSJoe (AWS)
Answered 5 months ago
