Can I attach an EI accelerator to a multi-model server?


Hey,

After discovering that ml.inf instances do not support multi-model servers (MMS), we were forced to re-deploy our models. Currently, we are struggling to use MMS with EI accelerators. Is this an issue on our side, or are these accelerators simply not supported with MMS?

ClientError: An error occurred (ValidationException) when calling the CreateModel operation: Your Ecr Image pytorch-inference-eia:1.5.1-cpu-py3 does not contain required com.amazonaws.sagemaker.capabilities.multi-models=true Docker label(s).
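For context, this is a minimal sketch of the kind of call that produces the error (the role ARN, bucket, region, and resource names below are placeholders, not our actual values). The `Mode: "MultiModel"` setting on the container is what requires the `multi-models=true` Docker label, while the EI accelerator itself would only be attached later via `AcceleratorType` on the endpoint config:

```python
import boto3

sm = boto3.client("sagemaker")

# Multi-model container: Mode="MultiModel" requires the image to carry the
# com.amazonaws.sagemaker.capabilities.multi-models=true label, which the
# pytorch-inference-eia image apparently does not. This call is where the
# ValidationException above is raised. (All names/ARNs are placeholders.)
sm.create_model(
    ModelName="mms-eia-model",
    ExecutionRoleArn="arn:aws:iam::123456789012:role/SageMakerRole",
    PrimaryContainer={
        "Image": "763104351884.dkr.ecr.us-east-1.amazonaws.com/pytorch-inference-eia:1.5.1-cpu-py3",
        "Mode": "MultiModel",
        "ModelDataUrl": "s3://my-bucket/models/",
    },
)

# If create_model succeeded, the EI accelerator would be attached here,
# on the production variant of the endpoint config.
sm.create_endpoint_config(
    EndpointConfigName="mms-eia-config",
    ProductionVariants=[{
        "VariantName": "AllTraffic",
        "ModelName": "mms-eia-model",
        "InstanceType": "ml.c5.large",
        "InitialInstanceCount": 1,
        "AcceleratorType": "ml.eia2.medium",
    }],
)
```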

Thanks!

No answers
