Can I attach an EI accelerator to a multi model server?



After discovering that ml.inf instances do not support multi-model servers (MMS), we had to redeploy our models. We are now struggling to use MMS together with EI accelerators. Is this an issue on our side, or are these accelerators simply not supported with MMS?

ClientError: An error occurred (ValidationException) when calling the CreateModel operation: Your Ecr Image pytorch-inference-eia:1.5.1-cpu-py3 does not contain required com.amazonaws.sagemaker.capabilities.multi-models=true Docker label(s).
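For context, here is a minimal sketch of the CreateModel request parameters that reproduce the error (model name, role ARN, and S3 prefix are hypothetical). Multi-model mode is requested via Mode="MultiModel" in the primary container, and SageMaker validates that the image carries the com.amazonaws.sagemaker.capabilities.multi-models=true Docker label, which the pytorch-inference-eia images apparently lack:

```python
# Hypothetical CreateModel request parameters that trigger the
# ValidationException above. Mode="MultiModel" requires the container
# image to carry the Docker label
# com.amazonaws.sagemaker.capabilities.multi-models=true.
create_model_params = {
    "ModelName": "my-mms-eia-model",  # hypothetical name
    "ExecutionRoleArn": "arn:aws:iam::123456789012:role/SageMakerRole",  # hypothetical
    "PrimaryContainer": {
        # EIA inference image from the AWS Deep Learning Containers registry
        "Image": "763104351884.dkr.ecr.us-east-1.amazonaws.com/"
                 "pytorch-inference-eia:1.5.1-cpu-py3",
        "Mode": "MultiModel",  # triggers the multi-models label validation
        "ModelDataUrl": "s3://my-bucket/models/",  # hypothetical S3 prefix
    },
}

# With boto3 this would be submitted as:
#   sagemaker_client.create_model(**create_model_params)
# and fails with the ValidationException shown above.
```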


No Answers
