3 Answers
Any updates on this?
Btw, you should add a warning to the documentation of Model.deploy() here: https://sagemaker.readthedocs.io/en/v2.169.0/api/inference/model.html. I've been getting a ValidationException and spent hours debugging it without a clue as to why it was failing. I am also using a VPC config. Honestly AWS, fix your documentation.
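Since Serverless Inference does not support VPC configuration (see the answer below), one way to fail fast instead of waiting for a cryptic ValidationException is a small pre-flight check before calling deploy. This is a hypothetical helper sketch, not part of the SageMaker SDK; the function name and argument shapes are assumptions:

```python
def check_serverless_vpc_conflict(serverless_config, vpc_config):
    """Raise a clear error before deploy() instead of a cryptic
    ValidationException from the service.

    serverless_config: the serverless inference config (or None)
    vpc_config: the VPC config dict passed to the model (or None)
    """
    if serverless_config is not None and vpc_config is not None:
        raise ValueError(
            "Serverless Inference endpoints do not support a VPC config; "
            "remove vpc_config, or deploy a provisioned endpoint instead."
        )
```

You would call this with whatever you are about to pass to `Model(..., vpc_config=...)` and `model.deploy(serverless_inference_config=...)`, so the failure happens in your code with an actionable message.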
answered 10 months ago
SageMaker Serverless Inference is currently in preview, and VPC support is not available. The feature you are asking for is an important one and is on the roadmap (unfortunately, I cannot share the exact timelines here).
answered 2 years ago
How long will this be in preview? I hope that when it comes out, it will have support for VPC.
answered a year ago
Hi! We also lost several hours because of incomplete and misleading documentation. What we ended up doing was creating a VPC for our Redis cluster that SageMaker was not part of, and handling the caching in the API that calls the SageMaker endpoint. Maybe you can use a similar approach.