S3 PutObject rate limit reached


I have an S3 bucket in which keys are organized as

<user_id>/<dataset_id>/<actual data blocks>

<user_id> may take ~3,250 different values. <dataset_id> is always unique.

Every second, my service writes a data block of ~30 KB to each prefix. I then get the following error:

(SlowDown) when calling the PutObject operation (reached max retries: 4): Please reduce your request rate.
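For context, each writer does roughly the following (a minimal sketch assuming boto3; the bucket name and the next_block helper are placeholders for illustration):

import time

import boto3

s3 = boto3.client("s3")

def writer(bucket, user_id, next_block):
    # Each of the ~3,250 writers runs a loop like this:
    # one ~30 KB PutObject per second under its own <user_id> prefix.
    while True:
        dataset_id, block = next_block()  # dataset_id is unique per write
        s3.put_object(
            Bucket=bucket,
            Key=f"{user_id}/{dataset_id}/data",
            Body=block,  # ~30 KB payload
        )
        time.sleep(1)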

The documentation clearly states that S3 supports "3,500 PUT/COPY/POST/DELETE or 5,500 GET/HEAD requests per second per prefix", but that doesn't seem to be the case here. S3 cannot even handle 1 request per second per prefix with 3,250 concurrent writers.

Is there a problem with the way I organize my bucket keys, or is this an S3 capacity/documentation issue? Thanks!

Asked 2 years ago · 11,339 views
1 Answer

Hi There

Is there any prefix before <user_id>?

Please also take a look at this article.

https://aws.amazon.com/premiumsupport/knowledge-center/s3-503-within-request-rate-prefix/

If there is a fast spike in the request rate for objects in a prefix, Amazon S3 might return 503 Slow Down errors while it scales in the background to handle the increased request rate. To avoid these errors, you can configure your application to gradually increase the request rate and retry failed requests using an exponential backoff algorithm [1].

[1] https://docs.aws.amazon.com/general/latest/gr/api-retries.html
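As a minimal sketch of that approach (assuming boto3; the "adaptive" retry mode and the put_with_backoff helper are illustrative, not a prescribed solution):

import random
import time

import boto3
from botocore.config import Config
from botocore.exceptions import ClientError

# Raise the client's retry ceiling; "adaptive" mode also applies
# client-side rate limiting when it sees throttling responses.
s3 = boto3.client(
    "s3",
    config=Config(retries={"max_attempts": 10, "mode": "adaptive"}),
)

def put_with_backoff(bucket, key, body, max_attempts=8):
    # Application-level exponential backoff with jitter, on top of
    # whatever retries botocore performs internally.
    for attempt in range(max_attempts):
        try:
            return s3.put_object(Bucket=bucket, Key=key, Body=body)
        except ClientError as err:
            if err.response["Error"]["Code"] != "SlowDown":
                raise
            time.sleep(min(2 ** attempt, 32) + random.random())
    raise RuntimeError(f"gave up on {key} after {max_attempts} attempts")

The random jitter spreads retries across your ~3,250 writers so they don't all hit the same prefix partition again at the same instant.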

Matt-B (AWS Expert)
Answered 2 years ago · Reviewed by an Expert 24 days ago
