How can I run SageMaker Serverless Inference on a GPU instance?


I want to run an ML model with SageMaker Serverless Inference on a GPU instance. There is no option to select the instance type. Is it possible to run on a GPU instance?

Salman
Asked 5 months ago · 1087 views
1 Answer

Unfortunately, GPU-based inference isn't currently supported on SageMaker Serverless Inference. From the feature exclusions section of the serverless endpoints documentation:

Some of the features currently available for SageMaker Real-time Inference are not supported for Serverless Inference, including GPUs, AWS marketplace model packages, private Docker registries, Multi-Model Endpoints, VPC configuration, network isolation, data capture, multiple production variants, Model Monitor, and inference pipelines.

Link here: https://docs.aws.amazon.com/sagemaker/latest/dg/serverless-endpoints.html
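This limitation is visible in the API itself: a serverless endpoint configuration has no instance-type field at all, so there is no way to request a GPU. A minimal sketch with boto3 (model and config names are hypothetical; actually deploying requires AWS credentials, so the `create_endpoint_config` call is shown commented out):

```python
# A serverless production variant: capacity is expressed only as memory
# size and max concurrency -- there is no "InstanceType" key to set.
serverless_variant = {
    "ModelName": "my-model",        # hypothetical model name
    "VariantName": "AllTraffic",
    "ServerlessConfig": {
        "MemorySizeInMB": 4096,     # 1024-6144 MB, in 1 GB increments
        "MaxConcurrency": 10,
    },
}

# No way to ask for a GPU: the variant carries no instance type.
assert "InstanceType" not in serverless_variant

# To deploy (needs AWS credentials and an existing SageMaker model):
# import boto3
# sm = boto3.client("sagemaker")
# sm.create_endpoint_config(
#     EndpointConfigName="my-serverless-config",
#     ProductionVariants=[serverless_variant],
# )
```

If you need GPU inference today, a real-time endpoint with a GPU instance type (e.g. an `ml.g4dn.*` instance) is the supported path instead.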

AWS
AWSJoe
Answered 5 months ago
