Serverless LLMs


I have a couple of questions to ask:

  1. Can I deploy a model serverlessly on SageMaker? My current requirement is to serve pretrained models with around 8-12 GB of weights.
  2. What is the best approach for deploying and running LLMs on SageMaker if I want to bring in models from GitHub and manage them there? I may also need to fine-tune them along the way.
Asked 3 months ago, 176 views
1 Answer

Hi,

Yes, you can deploy models in serverless mode. Look at this blog post for all details: https://aws.amazon.com/blogs/machine-learning/deploying-ml-models-using-sagemaker-serverless-inference-preview/
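
If it helps, here is a minimal sketch of what such a deployment looks like with the SageMaker Python SDK. The S3 path, IAM role, and framework versions are placeholders to replace with your own, and serverless endpoints are sized by memory and max concurrency rather than by instance type, so check the current memory limits against the size of your models.

```python
# Minimal sketch: deploy a packaged model to a SageMaker serverless endpoint.
# The S3 path, IAM role, and framework versions below are placeholders.
from sagemaker.huggingface import HuggingFaceModel
from sagemaker.serverless import ServerlessInferenceConfig

model = HuggingFaceModel(
    model_data="s3://my-bucket/model.tar.gz",                # placeholder: your packaged model weights
    role="arn:aws:iam::123456789012:role/MySageMakerRole",   # placeholder IAM role
    transformers_version="4.26",
    pytorch_version="1.13",
    py_version="py39",
)

# Serverless endpoints are configured by memory size and max concurrency,
# not by instance type; SageMaker scales them to zero when idle.
serverless_config = ServerlessInferenceConfig(
    memory_size_in_mb=6144,   # placeholder value; verify current serverless memory limits
    max_concurrency=5,
)

predictor = model.deploy(serverless_inference_config=serverless_config)
print(predictor.predict({"inputs": "Hello, world"}))
```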

For your own deployments, see this other blog post: https://aws.amazon.com/blogs/machine-learning/efficiently-train-tune-and-deploy-custom-ensembles-using-amazon-sagemaker/
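
For the fine-tuning part, a rough sketch of the train-then-deploy flow with a Hugging Face estimator is below. The script name, source directory, hyperparameters, data location, and role are all placeholders for illustration; the estimator also accepts a git_config argument if you prefer to pull the training code straight from a GitHub repository.

```python
# Minimal sketch: fine-tune a custom model with a SageMaker training job,
# then deploy the resulting artifact. Script name, hyperparameters, data
# location, and role are placeholders.
from sagemaker.huggingface import HuggingFace

estimator = HuggingFace(
    entry_point="train.py",            # placeholder: your training script
    source_dir="./scripts",            # placeholder: directory with the script and requirements.txt
    instance_type="ml.g5.2xlarge",
    instance_count=1,
    role="arn:aws:iam::123456789012:role/MySageMakerRole",  # placeholder IAM role
    transformers_version="4.26",
    pytorch_version="1.13",
    py_version="py39",
    hyperparameters={"epochs": 3, "model_name": "my-base-model"},  # placeholders
)

# Launch the training job on data staged in S3 (placeholder URI).
estimator.fit({"train": "s3://my-bucket/train-data/"})

# The trained artifact can then be deployed to an endpoint for inference.
predictor = estimator.deploy(initial_instance_count=1, instance_type="ml.g5.xlarge")
```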

Best,

Didier

AWS
EXPERT
Answered 3 months ago
