2 Answers
Hi there,
For S3Uri, it looks like you are passing the local string ./input.json. Have you tried passing the full S3 URI instead?
From the S3Uri documentation: depending on the value specified for S3DataType, it identifies either a key name prefix or a manifest. For example, a key name prefix might look like this: s3://bucketname/exampleprefix/
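For reference, a prefix-style TransformInput might be structured as below. This is a minimal sketch; the bucket, prefix, job name, and model name are placeholders, not values from the original post.

```python
# Minimal sketch of a Batch Transform request using a key-name-prefix S3Uri.
# All names below (bucket, prefix, job and model names) are placeholders.
transform_request = {
    "TransformJobName": "example-transform-job",
    "ModelName": "example-model",
    "TransformInput": {
        "DataSource": {
            "S3DataSource": {
                "S3DataType": "S3Prefix",                # a prefix, not a manifest
                "S3Uri": "s3://bucketname/inputjsons/",  # full S3 URI, not ./input.json
            }
        },
        "ContentType": "application/json",
    },
    "TransformOutput": {"S3OutputPath": "s3://bucketname/output/"},
}

# This dict could then be passed to boto3, e.g.:
# import boto3
# boto3.client("sagemaker").create_transform_job(**transform_request)
```

With S3DataType set to S3Prefix, every object under the prefix is treated as input, which is why the URI should point at the directory-style prefix rather than a local file path.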
Hi @Matt-B,
That seems to have fixed the initial processing, but it still fails when parsing the input. I've changed it to an S3 directory prefix: s3://bucketname/inputjsons/. In this directory, I've placed a single input.json file for now.
Once I run the job, it POSTs to the /invocations endpoint correctly, and I can see the model running in the logs. However, the overall job fails with a 400 Bad Request error.
[sagemaker logs]: MaxConcurrentTransforms=100, MaxPayloadInMB=1, BatchStrategy=SINGLE_RECORD
[sagemaker logs]: bucketname/inputjsons/input.json: ClientError: 400
[sagemaker logs]: bucketname/inputjsons/input.json:
[sagemaker logs]: bucketname/inputjsons/input.json: Message:
[sagemaker logs]: bucketname/inputjsons/input.json: <!doctype html>
[sagemaker logs]: bucketname/inputjsons/input.json: <html lang=en>
[sagemaker logs]: bucketname/inputjsons/input.json: <title>400 Bad Request</title>
[sagemaker logs]: bucketname/inputjsons/input.json: <h1>Bad Request</h1>
[sagemaker logs]: bucketname/inputjsons/input.json: <p>The browser (or proxy) sent a request that this server could not understand.</p>
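The HTML body in those logs looks like a web framework's default 400 error page, which usually means the container's /invocations handler could not parse the request body it received. One quick local check before re-running the job is to confirm the file parses the way a handler calling json.loads on the raw body would see it. This is a sketch with a placeholder payload standing in for the real input.json contents:

```python
import json

# Placeholder standing in for the real input.json contents.
payload = '{"instances": [{"feature": 1.0}]}'

# With BatchStrategy=SINGLE_RECORD, SageMaker sends the whole object body
# in one POST to /invocations; a handler doing json.loads(body) would
# return 400 if this parse fails.
try:
    record = json.loads(payload)
    ok = True
except json.JSONDecodeError:
    record = None
    ok = False

print(ok)
```

It is also worth checking the ContentType on TransformInput: if the container receives a content type it does not expect (for example a default of application/octet-stream), frameworks such as Flask can reject the request with exactly this kind of default 400 page even when the JSON itself is valid.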
answered 25 days ago