Can I attach an EI accelerator to a multi-model server?


Hey,

After discovering that ml.inf machines do not support multi-model servers (MMS), we had to re-deploy our models. We are now trying to use MMS with EI accelerators, but the CreateModel call fails with the error below. Is this an issue on our side, or are EI accelerators not supported with MMS?

ClientError: An error occurred (ValidationException) when calling the CreateModel operation: Your Ecr Image pytorch-inference-eia:1.5.1-cpu-py3 does not contain required com.amazonaws.sagemaker.capabilities.multi-models=true Docker label(s).
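For context, here is a minimal sketch of the kind of CreateModel request we are making (via boto3). All names, the role ARN, and the S3 path are placeholders for this post; only the image tag matches our actual setup:

```python
# Sketch of the CreateModel request that triggers the error above.
# Every identifier here is a placeholder except the image tag.
create_model_request = {
    "ModelName": "mms-eia-test",
    "ExecutionRoleArn": "arn:aws:iam::<account>:role/<SageMakerExecutionRole>",
    "PrimaryContainer": {
        # EIA inference image from ECR (account/region elided).
        "Image": "<account>.dkr.ecr.<region>.amazonaws.com/pytorch-inference-eia:1.5.1-cpu-py3",
        # MultiModel mode requires the container image to carry the
        # com.amazonaws.sagemaker.capabilities.multi-models=true Docker label,
        # which is what the ValidationException is complaining about.
        "Mode": "MultiModel",
        "ModelDataUrl": "s3://<bucket>/<models-prefix>/",
    },
}

# import boto3
# sm = boto3.client("sagemaker")
# sm.create_model(**create_model_request)  # raises the ValidationException above
```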

Thanks!