S3 PutObject rate limit reached


I have an S3 bucket in which keys are organized as

<user_id>/<dataset_id>/<actual data blocks>

<user_id> may take ~3250 different values. <dataset_id> values are always unique.

Every second, my service writes a ~30 KB data block to each prefix. I then get the following error:

(SlowDown) when calling the PutObject operation (reached max retries: 4): Please reduce your request rate.
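
My write loop looks roughly like this (simplified; the bucket name, ID sources, and data are placeholders):

```python
import boto3

s3 = boto3.client("s3")

# Simplified workload: one ~30 KB PutObject per <user_id> prefix, once per second.
def write_tick(bucket, user_ids, new_dataset_id, make_block):
    for user_id in user_ids:                         # ~3250 user IDs
        key = f"{user_id}/{new_dataset_id()}/data"   # <user_id>/<dataset_id>/...
        s3.put_object(Bucket=bucket, Key=key, Body=make_block())  # ~30 KB body
```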

The documentation clearly states that S3 supports "3,500 PUT/COPY/POST/DELETE or 5,500 GET/HEAD requests per second per prefix", but that does not seem to be the case here: S3 cannot even handle 1 request per second per prefix with 3,250 concurrent writers.

Is there a problem with the way I organize my bucket keys, or is this an S3 capacity/documentation issue? Thanks!

Asked 2 years ago · Viewed 11337 times
1 Answer

Hi there,

Is there any prefix before <user_id>?

Please also take a look at this article.

https://aws.amazon.com/premiumsupport/knowledge-center/s3-503-within-request-rate-prefix/

If there is a fast spike in the request rate for objects in a prefix, Amazon S3 might return 503 Slow Down errors while it scales in the background to handle the increased request rate. To avoid these errors, you can configure your application to gradually increase the request rate and retry failed requests using an exponential backoff algorithm [1].

[1] https://docs.aws.amazon.com/general/latest/gr/api-retries.html
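
For example, here is a rough sketch of retrying with exponential backoff using boto3 (assuming your error comes from boto3, as the message format suggests; the retry limits and jitter cap are illustrative, not prescriptive):

```python
import random
import time

import boto3
from botocore.config import Config
from botocore.exceptions import ClientError

# Built-in retries: "adaptive" mode adds client-side rate limiting on top of
# the SDK's exponential backoff; max_attempts raises the retry budget past
# the default that produced "reached max retries: 4".
s3 = boto3.client(
    "s3",
    config=Config(retries={"max_attempts": 10, "mode": "adaptive"}),
)

def put_with_backoff(bucket, key, body, max_attempts=8):
    """Application-level retry with full-jitter exponential backoff, for
    writes that still fail with SlowDown after the SDK's own retries."""
    for attempt in range(max_attempts):
        try:
            s3.put_object(Bucket=bucket, Key=key, Body=body)
            return
        except ClientError as err:
            if err.response["Error"]["Code"] not in ("SlowDown", "ServiceUnavailable"):
                raise
            # Sleep a random amount up to 2**attempt seconds, capped at 60 s.
            time.sleep(random.uniform(0, min(2 ** attempt, 60)))
    raise RuntimeError(f"{key}: still throttled after {max_attempts} attempts")
```

The key point is that the request rate should ramp up gradually rather than spiking, so S3 has time to scale the prefix in the background.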

AWS EXPERT
Matt-B
answered 2 years ago
Reviewed by an EXPERT 23 days ago
