Hello,
So the issue here is that, unlike predictor.predict (which converts the data into the format the endpoint expects), invoke_endpoint requires you to serialize or encode the payload yourself. You can do this with something like json.dumps(payload), or json.dumps(payload).encode() if the endpoint expects a byte array.
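For the invoke_endpoint route specifically, the manual serialization might look like the following sketch. The payload shape and the endpoint name "my-endpoint" are assumptions for illustration, not from your code:

```python
import json

# Hypothetical example payload; adjust to what your model expects.
payload = {"instances": [[0.5, 1.2, 3.4]]}

# invoke_endpoint expects the request body as a string or bytes,
# so serialize the dict yourself:
body = json.dumps(payload).encode("utf-8")

# The actual call might then look like this (requires AWS credentials
# and a deployed endpoint; "my-endpoint" is a placeholder):
# import boto3
# runtime = boto3.client("sagemaker-runtime")
# response = runtime.invoke_endpoint(
#     EndpointName="my-endpoint",
#     ContentType="application/json",
#     Body=body,
# )
# result = json.loads(response["Body"].read())
```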
If you want to use the predictor class instead, this is taken care of by the serializer option. The serializer encodes/decodes the data for you and lets you simply call the endpoint through the predictor class. For example:
from sagemaker.predictor import Predictor
from sagemaker.serializers import IdentitySerializer
from sagemaker.deserializers import JSONDeserializer

predictor = Predictor(endpoint_name="<your-endpoint-name>",
                      serializer=IdentitySerializer(content_type="application/json"),
                      deserializer=JSONDeserializer())
Hope this helps!
To check out the various serializer options for your different use cases, see the following link:
Serializers: https://sagemaker.readthedocs.io/en/stable/api/inference/serializers.html
Edited by: rvegira-aws on Jul 22, 2021 9:22 AM
Thanks rvegira-aws,
I changed my approach: instead of using the invoke_endpoint method, I used the predictor class as you suggested, and this fixed the issue.
Regards.
I faced the exact same problem when building models for my website. Thanks for the question!