How can I run SageMaker Serverless Inference on a GPU instance?


I want to run an ML model with SageMaker Serverless Inference on a GPU instance. There is no option to select the instance type. Is it possible to run on a GPU instance?

Salman
Asked 5 months ago · Viewed 1086 times
1 Answer

Unfortunately, GPU-based inference isn't currently supported on SageMaker Serverless Inference. From the feature exclusions section of the serverless endpoints documentation:

Some of the features currently available for SageMaker Real-time Inference are not supported for Serverless Inference, including GPUs, AWS marketplace model packages, private Docker registries, Multi-Model Endpoints, VPC configuration, network isolation, data capture, multiple production variants, Model Monitor, and inference pipelines.

Link here: https://docs.aws.amazon.com/sagemaker/latest/dg/serverless-endpoints.html
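If it helps to see where the difference shows up in the API, here is a minimal boto3 sketch (the model name "my-model" and the endpoint config names are placeholders I made up). A serverless endpoint config only exposes memory size and max concurrency, so there is no field where a GPU could be requested; a real-time config is where you would specify a GPU instance type such as ml.g4dn.xlarge instead.

```python
import boto3

sm = boto3.client("sagemaker")

# Serverless Inference: the variant takes a ServerlessConfig with only
# memory size and concurrency -- there is no InstanceType field, so no
# way to request GPU hardware here.
sm.create_endpoint_config(
    EndpointConfigName="my-serverless-config",
    ProductionVariants=[
        {
            "VariantName": "AllTraffic",
            "ModelName": "my-model",
            "ServerlessConfig": {
                "MemorySizeInMB": 4096,  # 1024-6144, in 1 GB increments
                "MaxConcurrency": 5,
            },
        }
    ],
)

# Real-time Inference: the instance type is explicit, so a GPU instance
# can be selected as an alternative to serverless.
sm.create_endpoint_config(
    EndpointConfigName="my-gpu-realtime-config",
    ProductionVariants=[
        {
            "VariantName": "AllTraffic",
            "ModelName": "my-model",
            "InstanceType": "ml.g4dn.xlarge",
            "InitialInstanceCount": 1,
        }
    ],
)
```

So if your model needs a GPU, a real-time endpoint (or Asynchronous Inference) with a GPU instance type is the way to go for now.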

AWSJoe (AWS)
Answered 5 months ago
