The SQS Extended Client facilitates sending larger payloads through SQS. SQS supports messages up to 256 KB; if you need to send larger payloads, the recommended approach is to store them in S3 and send the object key in the SQS message. The extended client does this for you.
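To make the pattern concrete, here is a minimal sketch of what the extended client does, with an in-memory dict and list standing in for S3 and SQS. All names here are illustrative; the real library (amazon-sqs-java-extended-client-lib) handles this transparently around the AWS SDK clients.

```python
import json
import uuid

SQS_LIMIT = 256 * 1024  # SQS message size limit: 256 KB

s3_bucket = {}   # stand-in for an S3 bucket
sqs_queue = []   # stand-in for an SQS queue

def send(payload: str) -> None:
    """Send small payloads directly; offload large ones to S3."""
    if len(payload.encode()) > SQS_LIMIT:
        key = str(uuid.uuid4())
        s3_bucket[key] = payload                        # PutObject
        sqs_queue.append(json.dumps({"s3_key": key}))   # message carries a pointer
    else:
        sqs_queue.append(json.dumps({"body": payload})) # message carries the body

def receive() -> str:
    """Resolve the pointer back to the payload on the consumer side."""
    msg = json.loads(sqs_queue.pop(0))
    if "s3_key" in msg:
        return s3_bucket[msg["s3_key"]]                 # GetObject
    return msg["body"]
```

The error in the question corresponds to the `GetObject` step in `receive` failing because the key is missing.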
It is difficult to say from the information provided what the issue is, as we do not know why it failed to get the object from S3. One possibility is that the consumer's role does not have the right policy to access S3. If this is the case, all messages containing large payloads will fail.
I would recommend looking at the role first, and if that is not the issue, checking CloudTrail to see what the errors are.
Hi Uri, thanks for responding. I should have made clear that not all messages have this problem, so it's not anything to do with the bucket policy.
The exception is thrown when the extended client fails to find the content on S3, i.e. the key doesn't exist. I have no idea why the key doesn't exist; I'm not getting any exceptions thrown on the publishing side of the queue.
Enabling CloudTrail just confirms that the extended client code can't find the key on S3.
Behind the scenes, you are pushing a file into S3, which historically was not strongly consistent in all cases. The old documentation said: "Amazon S3 provides read-after-write consistency for PUTS of new objects in your S3 bucket in all regions with one caveat." Ignoring the caveat, this means a client issuing a GET following a PUT for a new object is guaranteed to get the correct result.
Per the current documentation, since late 2020 S3 provides strong read-after-write consistency, so you should be able to get the object as soon as the write completes.
Still, my thought is that you are consuming the message too early, before the S3 key is available.
Try adding some delay to your message when pushing to SQS, using DelaySeconds: https://docs.aws.amazon.com/AWSSimpleQueueService/latest/SQSDeveloperGuide/sqs-delay-queues.html
This gives the S3 object time to become available before the consumer reads it.
Also try to check whether the object really persists in S3 after the write.
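An alternative to delaying delivery is to retry the S3 read on the consumer side before giving up on the message. This is a hypothetical sketch of that idea with the S3 call stubbed out as a plain callable; in real code you would wrap boto3's `get_object` (or the SDK equivalent) and catch its no-such-key error instead of the placeholder exception used here.

```python
import time

class KeyNotFound(Exception):
    """Stand-in for the SDK's 'key does not exist' error."""

def get_payload_with_retry(s3_get, key, attempts=5, delay_s=1.0):
    """Call s3_get(key); on KeyNotFound, retry up to `attempts` times,
    sleeping `delay_s` seconds between reads, then re-raise."""
    for i in range(attempts):
        try:
            return s3_get(key)
        except KeyNotFound:
            if i == attempts - 1:
                raise
            time.sleep(delay_s)  # back off before the next read
```

If the key never appears even after retries, the object was likely never written (or was deleted), which points back at the publishing side.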
SQS - Extended Client - Too many ERROR [S3Dao] Failed to get the S3 object which contains the payload. (asked 2 months ago)