S3 PutObject rate limit reached

I have an S3 bucket in which keys are organized as

<user_id>/<dataset_id>/<actual data blocks>

<user_id> can take ~3,250 different values. <dataset_id> values are always unique.

Every second, my service writes a ~30 KB data block to each prefix. I then get the following error:

(SlowDown) when calling the PutObject operation (reached max retries: 4): Please reduce your request rate.
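
Each write is a single PutObject call per prefix, roughly like the following sketch (Python/boto3; the bucket name and function are simplified stand-ins for my actual code):

import boto3

s3 = boto3.client("s3")

# One ~30 KB block is written under each <user_id>/<dataset_id>/ prefix per second.
def write_block(user_id, dataset_id, block_id, data):
    s3.put_object(
        Bucket="my-bucket",  # hypothetical bucket name
        Key=f"{user_id}/{dataset_id}/{block_id}",
        Body=data,
    )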

The documentation clearly states that S3 supports "3,500 PUT/COPY/POST/DELETE or 5,500 GET/HEAD requests per second per prefix", but that doesn't seem to be the case here: S3 cannot even handle 1 request per second per prefix with 3,250 concurrent writers.

Is there a problem with the way I organize my bucket keys, or is this an S3 capacity/documentation issue? Thanks!

Asked 2 years ago · 11,321 views

1 Answer

Hi there,

Is there any prefix before <user_id>?

Please also take a look at this article.

https://aws.amazon.com/premiumsupport/knowledge-center/s3-503-within-request-rate-prefix/

If there is a fast spike in the request rate for objects in a prefix, Amazon S3 might return 503 Slow Down errors while it scales in the background to handle the increased request rate. To avoid these errors, you can configure your application to gradually increase the request rate and retry failed requests using an exponential backoff algorithm [1].

[1] https://docs.aws.amazon.com/general/latest/gr/api-retries.html
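
As a concrete sketch of that retry pattern in Python/boto3 (the error message above is boto3's format; the function name and timing constants here are illustrative, not prescriptive):

import random
import time

import boto3
from botocore.exceptions import ClientError

s3 = boto3.client("s3")

def put_with_backoff(bucket, key, body, max_attempts=8):
    """PUT an object, retrying 503 SlowDown responses with exponential backoff and full jitter."""
    for attempt in range(max_attempts):
        try:
            return s3.put_object(Bucket=bucket, Key=key, Body=body)
        except ClientError as err:
            if err.response["Error"]["Code"] != "SlowDown":
                raise
            # Sleep a random duration up to 100 ms * 2^attempt, capped at 20 s,
            # so thousands of concurrent writers don't retry in lockstep.
            time.sleep(random.uniform(0, min(20.0, 0.1 * 2 ** attempt)))
    raise RuntimeError(f"retries exhausted for s3://{bucket}/{key}")

Recent boto3/botocore versions can also handle this for you: passing botocore.config.Config(retries={"max_attempts": 10, "mode": "adaptive"}) when creating the client enables exponential backoff plus client-side rate limiting, which pairs well with gradually ramping up the request rate.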

Matt-B (AWS Expert)
Answered 2 years ago · Reviewed 22 days ago
