S3 PutObject rate limit reached


I have an S3 bucket in which keys are organized as

<user_id>/<dataset_id>/<actual data blocks>

<user_id> can take ~3,250 different values. <dataset_id> values are always unique.

Every second, my service writes one data block of ~30 KB to each prefix. I then get the following error:

(SlowDown) when calling the PutObject operation (reached max retries: 4): Please reduce your request rate.

The documentation clearly states that S3 supports "3,500 PUT/COPY/POST/DELETE or 5,500 GET/HEAD requests per second per prefix", but that does not seem to be the case here: S3 cannot even sustain 1 request per second per prefix with 3,250 concurrent writers.

Is there a problem with the way I organize my bucket keys, or is this an S3 capacity/documentation issue? Thanks!

Asked 2 years ago · 11,472 views
1 answer

Hi There

Is there any prefix before <user_id>?

Please also take a look at this article.

https://aws.amazon.com/premiumsupport/knowledge-center/s3-503-within-request-rate-prefix/

If there is a fast spike in the request rate for objects in a prefix, Amazon S3 might return 503 Slow Down errors while it scales in the background to handle the increased request rate. To avoid these errors, you can configure your application to gradually increase the request rate and retry failed requests using an exponential backoff algorithm [1].

[1] https://docs.aws.amazon.com/general/latest/gr/api-retries.html
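As a minimal sketch of the retry-with-backoff approach from [1] (the function name `put_with_backoff` and its parameters are hypothetical, not part of any AWS SDK):

```python
import random
import time


def put_with_backoff(put_fn, max_retries=5, base_delay=0.5, cap=20.0):
    """Call put_fn, retrying SlowDown-style errors with exponential backoff.

    Uses "full jitter": each retry sleeps a random duration between 0 and
    min(cap, base_delay * 2**attempt), which spreads out retries from many
    concurrent writers instead of having them all retry in lockstep.
    """
    for attempt in range(max_retries + 1):
        try:
            return put_fn()
        except Exception as exc:
            # Re-raise anything that is not a throttling error, and give up
            # once the retry budget is exhausted.
            if "SlowDown" not in str(exc) or attempt == max_retries:
                raise
            delay = random.uniform(0, min(cap, base_delay * 2 ** attempt))
            time.sleep(delay)
```

If you are using boto3, you can get similar behavior without hand-rolling it by configuring the client's built-in retry handler, e.g. `boto3.client("s3", config=botocore.config.Config(retries={"max_attempts": 10, "mode": "adaptive"}))`, which also raises the retry ceiling above the default that produced the "reached max retries: 4" error above.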

AWS
EXPERT
Matt-B
answered 2 years ago
Verified a month ago
