I would suggest invoking your model locally first and confirming what input your model expects using the `saved_model_cli` tool (for example, `saved_model_cli show --dir <model_dir> --all` prints the model's input signatures). See this link: https://www.tensorflow.org/guide/saved_model#the_savedmodel_format_on_disk
Then, when invoking the model, confirm that each instance matches the input format and shape your model expects.
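As a minimal sketch of what the request body should look like: a SavedModel served by the SageMaker TensorFlow Serving container accepts the TF Serving REST format, where a JSON object carries an `instances` list whose entries match the input signature's shape. The endpoint name below is hypothetical, and the boto3 call is shown commented out since it needs AWS credentials and a live endpoint.

```python
import json

def build_payload(batch):
    """Wrap a batch of inputs in the TF Serving REST format.

    Each element of `batch` must match the shape reported by
    `saved_model_cli show --dir <model_dir> --all` for your input tensor.
    """
    return json.dumps({"instances": batch})

# One instance of shape (4,) -- adjust to your model's actual signature.
payload = build_payload([[1.0, 2.0, 3.0, 4.0]])

# Hypothetical invocation via boto3 (requires credentials and a deployed
# endpoint named "my-tf-endpoint"):
# import boto3
# runtime = boto3.client("sagemaker-runtime")
# response = runtime.invoke_endpoint(
#     EndpointName="my-tf-endpoint",
#     ContentType="application/json",
#     Body=payload,
# )
# print(json.loads(response["Body"].read()))
```

If the shapes in the payload do not match the signature, TF Serving typically rejects the request with a 4xx error describing the expected input, which is usually the quickest way to spot the mismatch.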