1 Answer
Hello, you could indeed extend the pre-built container and install your required Python version together with your other libraries. That way, everything needed for inference would be pre-installed, and you would just add the extra libraries that you need. I would run a test locally to see if that works for you. You could also check whether your model works with one of the newer DLC images that come with Python 3.8.
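Extending a pre-built container along these lines might look like the sketch below. The base image URI, framework tag, and package names are illustrative assumptions; look up the exact DLC image for your framework version and region in the AWS Deep Learning Containers image list.

```dockerfile
# Sketch: extend a pre-built SageMaker PyTorch inference DLC.
# The image URI and tag below are assumptions -- substitute the
# DLC image that matches your framework version and region.
FROM 763104351884.dkr.ecr.us-east-1.amazonaws.com/pytorch-inference:1.12.1-cpu-py38

# Install the extra libraries your model needs on top of the
# Python that already ships with the container (package names
# here are placeholders for your own requirements).
RUN pip install --no-cache-dir scikit-learn==1.1.3 sentencepiece
```

After building, you would push the image to your own ECR repository and pass its URI as the image_uri when creating the SageMaker model.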
answered a year ago
Thank you so much for your answer, dear Christian! I appreciate it! I will try to extend the pre-built container and let you know. Do you know by any chance if there's any documentation or tutorial I can follow for it?
Furthermore, can I generally assume that this approach is best practice? For example, in my work I often have to find suitable pre-trained ML models on GitHub and implement them in endpoints. The repos generally share a conda environment with a specific Python version and a set of libraries. In order for me to use these exact same requirements of Python version + libraries, I would need to extend a pre-built container, as mentioned above, with the right Python version and its requirements. Right?
And this also removes the need to have a requirements.txt file alongside my inference.py file?

It all depends on the specific use case. Baking the extra dependencies into the container increases the size of your Docker image, and bigger images can take longer to download from ECR. There may be scenarios where it is better to install the libraries at run time via a requirements.txt. Similarly for inference.py: you might have a use case that only requires changing the model loading method (model_fn()) without the need to extend the pre-built containers. To sum it up, it depends on the use case. Here is an example of how to extend our PyTorch container. Also have a look at the documentation here.
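Overriding just the model loading hook could look like the sketch below. It uses pickle as a stand-in for your framework's real loader (e.g. torch.load), and the artifact name model.pkl is an assumption; SageMaker calls model_fn with the directory where it extracted your model archive.

```python
import os
import pickle


# inference.py -- only model_fn is overridden; the pre-built
# container keeps its default input/predict/output handlers.
def model_fn(model_dir):
    """Load the model artifact that SageMaker extracted into model_dir."""
    # pickle stands in here for your framework's loader
    # (e.g. torch.load); model.pkl is an assumed artifact name.
    with open(os.path.join(model_dir, "model.pkl"), "rb") as f:
        return pickle.load(f)
```

The container's serving stack then passes the returned object to its default predict handler, so nothing else in the container has to change.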
Hope it helps.
Sounds good, thank you so much! Appreciate your help!