S3 PutObject rate limit reached


I have an S3 bucket in which keys are organized as

<user_id>/<dataset_id>/<actual data blocks>

<user_id> may take ~3,250 different values. <dataset_id> is always unique.

Every second, my service writes a data block of ~30 KB to each prefix. I then get the following error:

(SlowDown) when calling the PutObject operation (reached max retries: 4): Please reduce your request rate.
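For reference, here is a minimal sketch of how each write is issued (using boto3; the bucket name, IDs, and payload below are placeholders, not my real values):

```python
import boto3

s3 = boto3.client("s3")

# Placeholder values; in the real service there are ~3,250 user_ids,
# each receiving one ~30 KB block per second.
bucket = "my-bucket"           # placeholder bucket name
user_id = "user-0001"          # one of ~3,250 prefixes
dataset_id = "ds-2024-01-01"   # unique per dataset
block_name = "block-000001"
payload = b"\x00" * 30 * 1024  # ~30 KB data block

s3.put_object(
    Bucket=bucket,
    Key=f"{user_id}/{dataset_id}/{block_name}",
    Body=payload,
)
```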

The documentation clearly states that S3 supports "3,500 PUT/COPY/POST/DELETE or 5,500 GET/HEAD requests per second per prefix", but that doesn't seem to be the case here. S3 cannot even handle 1 request per second per prefix with 3,250 concurrent writers.

Is there a problem with the way I organize my bucket keys, or is this an S3 capacity/documentation issue? Thanks!

Asked 2 years ago · 11,495 views
1 Answer

Hi there,

Is there any prefix before <user_id>?

Please also take a look at this article.

https://aws.amazon.com/premiumsupport/knowledge-center/s3-503-within-request-rate-prefix/

If there is a fast spike in the request rate for objects in a prefix, Amazon S3 might return 503 Slow Down errors while it scales in the background to handle the increased request rate. To avoid these errors, you can configure your application to gradually increase the request rate and retry failed requests using an exponential backoff algorithm [1].

[1] https://docs.aws.amazon.com/general/latest/gr/api-retries.html
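For example, a minimal retry wrapper with exponential backoff and jitter might look like the sketch below (boto3; the function name, attempt count, and backoff parameters are just illustrative):

```python
import random
import time

import boto3
from botocore.exceptions import ClientError

s3 = boto3.client("s3")

def put_with_backoff(bucket, key, body, max_attempts=8):
    """Retry PutObject with exponential backoff and full jitter on 503 SlowDown."""
    for attempt in range(max_attempts):
        try:
            return s3.put_object(Bucket=bucket, Key=key, Body=body)
        except ClientError as err:
            code = err.response["Error"]["Code"]
            if code != "SlowDown":
                raise  # only retry throttling errors here
            # Exponential backoff with full jitter: sleep in [0, 2^attempt * 100 ms]
            time.sleep(random.uniform(0, (2 ** attempt) * 0.1))
    raise RuntimeError(f"Giving up on PUT {key} after {max_attempts} attempts")
```

Recent SDK versions can also do this for you: botocore's client config accepts retries={"max_attempts": 10, "mode": "adaptive"}, which applies client-side rate limiting and backoff automatically.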

Matt-B (AWS Expert) · Answered 2 years ago
