1 Answer
Hi,
Do you have `instance_type = "local_gpu"`? When you use `"local"`, the model may default to the CPU instead of the GPU.
Best,
Didier
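To illustrate the suggestion above, here is a minimal sketch of a SageMaker estimator configured for local GPU mode. This is a configuration fragment, not a complete script: the entry point, IAM role, and framework versions are placeholders, and it assumes the `sagemaker` Python SDK with a CUDA-capable local Docker setup.

```python
# Sketch: SageMaker local mode on GPU. entry_point, role, and
# versions are placeholders; adapt them to your own project.
from sagemaker.pytorch import PyTorch

estimator = PyTorch(
    entry_point="train.py",                               # placeholder script
    role="arn:aws:iam::123456789012:role/SageMakerRole",  # placeholder role
    framework_version="1.13",
    py_version="py39",
    instance_count=1,
    instance_type="local_gpu",  # "local" runs on CPU; "local_gpu" uses the GPU
)
```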
As I said earlier, everything is in place: the model is loaded onto the GPUs, and every GPU shows vRAM in use. But when I start inference, only one GPU does the processing; GPU utilization stays at zero on all but the first GPU. Please refer to the second image ("GPU-Util") I attached to the question.
Thanks Didier.
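A common cause of this symptom (vRAM allocated on every GPU but compute on only one) is that the model weights were replicated to all devices while inference still runs on a single device. A hedged sketch of one way to spread inference batches across all visible GPUs with `torch.nn.DataParallel`, assuming PyTorch; the model and tensor shapes are stand-ins for the real workload:

```python
# Sketch: sharding inference batches across all visible GPUs with
# torch.nn.DataParallel. The Linear model is a stand-in for the real one.
import torch
import torch.nn as nn

model = nn.Linear(512, 10)  # placeholder model
if torch.cuda.device_count() > 1:
    # Splits each input batch along dim 0 and runs the chunks
    # concurrently on every available GPU.
    model = nn.DataParallel(model)

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
model.to(device)

with torch.no_grad():
    batch = torch.randn(64, 512, device=device)
    out = model(batch)  # work is distributed across GPUs when available
```

Whether this helps depends on how the original serving code loads the model; frameworks that only call `.to("cuda:0")` will show exactly the single-GPU utilization pattern described above.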