Serverless LLMs


I have a couple of questions:

  1. Can I deploy a model serverlessly on SageMaker? My current requirement is to serve some pretrained models with around 8-12 GB of weights.
  2. What is the best approach for deploying and running LLMs on SageMaker if I want to plug in models from GitHub and manage them there? I may also need fine-tuning along the way.
Asked 3 months ago · 176 views
1 Answer

Hi,

Yes, you can deploy models in serverless mode. Look at this blog post for all details: https://aws.amazon.com/blogs/machine-learning/deploying-ml-models-using-sagemaker-serverless-inference-preview/
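For a rough idea of what that deployment involves, here is a minimal sketch of the request payloads for a serverless endpoint using the boto3 SageMaker client. The model and endpoint names below are placeholder assumptions, not values from the linked post; see the post itself for the full walkthrough.

```python
# Sketch (assumptions, not from the linked posts): request payloads for a
# SageMaker serverless endpoint, to be sent with a boto3 "sagemaker" client.
# All names (model, config, endpoint) are placeholders.

# Serverless endpoint memory can be configured from 1024 to 6144 MB, so
# check whether your model's memory footprint fits within that limit.
serverless_config = {
    "MemorySizeInMB": 6144,
    "MaxConcurrency": 5,
}

endpoint_config_request = {
    "EndpointConfigName": "my-serverless-config",
    "ProductionVariants": [{
        "VariantName": "AllTraffic",
        "ModelName": "my-serverless-model",  # registered earlier via create_model
        "ServerlessConfig": serverless_config,
    }],
}

def deploy(sm_client):
    """Create the endpoint config, then the endpoint.

    sm_client is a boto3.client("sagemaker"); requires AWS credentials.
    """
    sm_client.create_endpoint_config(**endpoint_config_request)
    sm_client.create_endpoint(
        EndpointName="my-serverless-endpoint",
        EndpointConfigName="my-serverless-config",
    )
```

The key difference from a real-time endpoint is the `ServerlessConfig` block in the production variant, which replaces the instance type and count.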

For your own deployments, see this other blog post: https://aws.amazon.com/blogs/machine-learning/efficiently-train-tune-and-deploy-custom-ensembles-using-amazon-sagemaker/

Best,

Didier

AWS
Expert
Answered 3 months ago
