Can I attach an EI accelerator to a multi model server?


Hey,

After discovering that ml.inf instances do not support multi-model servers (MMS), we had to redeploy our models. Now we are struggling to use MMS with EI accelerators. Is this an issue on our side, or are these accelerators simply not supported with MMS?

ClientError: An error occurred (ValidationException) when calling the CreateModel operation: Your Ecr Image pytorch-inference-eia:1.5.1-cpu-py3 does not contain required com.amazonaws.sagemaker.capabilities.multi-models=true Docker label(s).
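For reference, here is a minimal sketch of the CreateModel call that produces this error. The role ARN, bucket, region, and the ECR account in the image URI are placeholders; the repository name and tag come from the error message above.

import boto3

# Sketch of the CreateModel call that raises the ValidationException above.
# Role ARN, S3 prefix, region, and ECR account are placeholders.
sm = boto3.client("sagemaker", region_name="us-east-1")

sm.create_model(
    ModelName="pytorch-eia-mms",
    ExecutionRoleArn="arn:aws:iam::123456789012:role/SageMakerExecutionRole",
    PrimaryContainer={
        # EI-enabled PyTorch inference image (as seen in the error message)
        "Image": "123456789012.dkr.ecr.us-east-1.amazonaws.com/pytorch-inference-eia:1.5.1-cpu-py3",
        # MultiModel mode requires the image to carry the
        # com.amazonaws.sagemaker.capabilities.multi-models=true Docker label,
        # which this EIA image apparently lacks.
        "Mode": "MultiModel",
        # S3 prefix under which the individual model archives live
        "ModelDataUrl": "s3://my-bucket/models/",
    },
)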

Thanks!
