How can I run SageMaker Serverless Inference on a GPU instance?


I want to run an ML model with SageMaker Serverless Inference on a GPU instance. There is no option to select the instance type. Is it possible to run on a GPU instance?

Salman
asked 5 months ago · 1,086 views
1 Answer

Unfortunately, GPU-based inference isn't currently supported on SageMaker Serverless Inference. From the feature exclusions section of the serverless endpoints documentation:

Some of the features currently available for SageMaker Real-time Inference are not supported for Serverless Inference, including GPUs, AWS marketplace model packages, private Docker registries, Multi-Model Endpoints, VPC configuration, network isolation, data capture, multiple production variants, Model Monitor, and inference pipelines.

Link here: https://docs.aws.amazon.com/sagemaker/latest/dg/serverless-endpoints.html
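For context, here's a minimal boto3 sketch (with hypothetical model and config names) contrasting the two endpoint config shapes. A serverless variant takes only a ServerlessConfig (memory size and max concurrency), which is why there is no instance type field to set, while a real-time variant is where you would pick a GPU instance type such as ml.g4dn.xlarge.

```python
import boto3

sm = boto3.client("sagemaker")

# Serverless endpoint config: no InstanceType field, so no GPU option.
sm.create_endpoint_config(
    EndpointConfigName="my-serverless-config",   # hypothetical name
    ProductionVariants=[
        {
            "ModelName": "my-model",              # hypothetical model
            "VariantName": "AllTraffic",
            "ServerlessConfig": {
                "MemorySizeInMB": 4096,           # 1024-6144, in 1 GB steps
                "MaxConcurrency": 10,
            },
        }
    ],
)

# Real-time endpoint config: this is where a GPU instance type can be chosen.
sm.create_endpoint_config(
    EndpointConfigName="my-gpu-realtime-config",  # hypothetical name
    ProductionVariants=[
        {
            "ModelName": "my-model",
            "VariantName": "AllTraffic",
            "InstanceType": "ml.g4dn.xlarge",     # GPU instance
            "InitialInstanceCount": 1,
        }
    ],
)
```

If the model needs a GPU, the second form (a real-time endpoint, optionally with auto scaling) is the supported path today.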

AWS
AWSJoe
answered 5 months ago
