Serverless LLMs


I have a couple of questions:

  1. Can I deploy a model serverlessly on SageMaker? My current requirement is to serve some pretrained models with roughly 8-12 GB of weights.
  2. What is the best approach for deploying and running LLMs on SageMaker if I want to pull models from GitHub and manage them there? I may also need fine-tuning along the way.
Asked 3 months ago · Viewed 176 times
1 Answer

Hi,

Yes, you can deploy models in serverless mode. Look at this blog post for all details: https://aws.amazon.com/blogs/machine-learning/deploying-ml-models-using-sagemaker-serverless-inference-preview/
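To make the serverless option concrete: under the hood it amounts to attaching a `ServerlessConfig` block to the endpoint configuration instead of specifying instance types. Here is a minimal sketch; the model and config names are placeholders, and note that serverless inference caps memory at 6144 MB (1-6 GB in 1 GB steps), which is worth checking against models in the 8-12 GB range.

```python
# Sketch: build the ProductionVariants entry for a SageMaker serverless
# endpoint config. "my-registered-model" below is a placeholder name.

def serverless_variant(model_name, memory_mb=6144, max_concurrency=5):
    """Return a ProductionVariants entry carrying a ServerlessConfig block.

    SageMaker serverless inference currently allows MemorySizeInMB values
    of 1024-6144 in 1024 MB increments, so very large models may not fit
    in this mode.
    """
    if memory_mb not in (1024, 2048, 3072, 4096, 5120, 6144):
        raise ValueError("MemorySizeInMB must be 1-6 GB in 1 GB steps")
    return {
        "VariantName": "AllTraffic",
        "ModelName": model_name,
        "ServerlessConfig": {
            "MemorySizeInMB": memory_mb,
            "MaxConcurrency": max_concurrency,
        },
    }

# To deploy, pass the variant to boto3's create_endpoint_config, e.g.:
#   import boto3
#   sm = boto3.client("sagemaker")
#   sm.create_endpoint_config(
#       EndpointConfigName="my-serverless-config",
#       ProductionVariants=[serverless_variant("my-registered-model")],
#   )
```

The same structure works whether the model was registered from a built-in container or from your own image.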

For your own deployments, see this other blog post: https://aws.amazon.com/blogs/machine-learning/efficiently-train-tune-and-deploy-custom-ensembles-using-amazon-sagemaker/

Best,

Didier

AWS
Expert
Answered 3 months ago
