[Feature Request] Serverless Inference with VPC Config


I would like to use a SageMaker Model with a custom VPC configuration, which is currently not possible with Serverless Inference. Is this feature planned? More generally: is there a roadmap somewhere for Serverless Inference?

Richard
Asked 2 years ago · 843 views
3 Answers

Any updates on this?

Btw, you should add a warning to the documentation of Model.deploy() here: https://sagemaker.readthedocs.io/en/v2.169.0/api/inference/model.html. I've been getting a ValidationException and have been trying to debug it for hours, without a clue as to why it's failing. I am also using a VPC config. Honestly, AWS, fix your damn documentation.
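For anyone hitting the same ValidationException: one way to fail fast before calling the API is a small pre-flight check that rejects the unsupported combination of a VPC config with a serverless endpoint. The helper below is a minimal sketch and is not part of the SageMaker SDK; it just mirrors the constraint described in this thread.

```python
def check_serverless_compatibility(model_kwargs: dict, serverless: bool) -> None:
    """Hypothetical pre-flight check, not part of the SageMaker SDK.

    SageMaker Serverless Inference does not support VpcConfig, so deploying
    a model that was created with vpc_config to a serverless endpoint fails
    with a ValidationException. Raise locally before making the API call.
    """
    if serverless and model_kwargs.get("vpc_config"):
        raise ValueError(
            "Serverless Inference does not support VpcConfig; "
            "remove vpc_config or deploy to a provisioned endpoint instead."
        )


# Example: a model configured with a VPC cannot be deployed serverlessly.
model_kwargs = {
    "vpc_config": {
        "Subnets": ["subnet-0abc"],          # placeholder IDs
        "SecurityGroupIds": ["sg-0abc"],
    }
}
try:
    check_serverless_compatibility(model_kwargs, serverless=True)
except ValueError as e:
    print(f"Blocked before the API call: {e}")
```

The subnet and security-group IDs above are placeholders; in real code you would pass the same kwargs you give to `sagemaker.model.Model(...)` before calling `deploy()` with a `ServerlessInferenceConfig`.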

mpw
Answered 10 months ago
  • Hi! We also lost several hours because of incomplete and misleading documentation. What we ended up doing was creating a VPC for our Redis cluster that SageMaker was not part of, and handling the caching in the API that calls the SageMaker endpoint. Maybe you can use a similar approach.


SageMaker Serverless Inference is currently in preview, and VPC support is not available. However, the feature you are asking for is an important one and is on the roadmap (unfortunately, I cannot share exact timeline details here).

AWS
Answered 2 years ago

How long will this be in preview? I hope that when it comes out, it will have support for VPC.

Answered a year ago
