1 Answer
I would suggest invoking your model locally first and confirming what input your model expects, using the saved_model
CLI. Kindly see this link: https://www.tensorflow.org/guide/saved_model#the_savedmodel_format_on_disk
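As a minimal sketch, assuming TensorFlow is installed and your extracted model artifact lives at `./model/1` (a hypothetical path — use your actual SavedModel version directory), the CLI can dump the serving signature like this:

```shell
# Show the inputs/outputs of the default serving signature.
# --dir points at the SavedModel version directory (hypothetical path).
saved_model_cli show --dir ./model/1 --tag_set serve --signature_def serving_default
```

The output lists each input tensor's name, dtype, and shape; the request you send to the endpoint must match that shape.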
Then, when invoking the model, confirm that the instances you send are in the input format and shape your model expects.
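For example, a TensorFlow Serving endpoint expects a JSON body with an `"instances"` key. The sketch below builds such a payload with the standard library; the 28x28 input shape is a hypothetical placeholder — substitute whatever shape `saved_model_cli show` reports for your model:

```python
import json

# Build a TF Serving-style request body for a TensorFlow endpoint.
# The (28, 28) shape is a hypothetical example -- replace it with the
# input shape your model's serving signature actually declares.
rows, cols = 28, 28
instance = [[0.0] * cols for _ in range(rows)]  # one dummy input instance

# "instances" holds a batch; here the batch contains a single instance.
payload = json.dumps({"instances": [instance]})

# Sanity-check the rank of the payload before sending it: one instance,
# each a rows x cols matrix.
decoded = json.loads(payload)
print(len(decoded["instances"]), len(decoded["instances"][0]))
```

If the endpoint rejects the request with a shape or format error, comparing this structure against the signature reported by the CLI is usually the fastest way to find the mismatch.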
answered 5 days ago