AWS S3 Batch - How is throttling handled, and is cost charged per request?


We are planning to call the Restore API using S3 Batch Operations. I have two questions for which I couldn't find an answer on the forum or in the documentation:

  1. How does AWS S3 Batch handle throttling errors? What is the throttling behavior for the S3 restore API, i.e., what is the default limit per account?
  2. If an S3 Batch job has 10,000 S3 objects, will the customer be charged for 10,000 requests to S3 or just for the S3 Batch job?

Edited by: SarthakAWS on Apr 23, 2020 9:20 AM

asked 4 years ago · 732 views
1 Answer

Thanks for your questions.

On the first one, S3 Batch Operations manages retries on your behalf, so any throttling is handled by varying the job's rate of execution. The overall rate of execution will depend on a number of factors in the case of S3 Glacier or S3 Glacier Deep Archive restores, including other application traffic outside of S3 Batch Operations that may also be issuing restore requests at that time.
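For reference, below is a minimal boto3 sketch of submitting such a restore job; the account ID, bucket names, manifest ETag, and role ARN are placeholders, and it assumes the manifest CSV and IAM role already exist. Because Batch Operations paces and retries the individual restore requests itself, the caller only submits the job and does not need per-object retry logic.

```python
# Hypothetical sketch: submit an S3 Batch Operations "Initiate Restore Object" job.
# All ARNs, bucket names, and the account ID below are placeholders.
import uuid
import boto3

s3control = boto3.client("s3control", region_name="us-east-1")

response = s3control.create_job(
    AccountId="111122223333",                      # placeholder account ID
    ConfirmationRequired=False,
    ClientRequestToken=str(uuid.uuid4()),          # idempotency token
    Operation={
        "S3InitiateRestoreObject": {
            "ExpirationInDays": 7,                 # how long the restored copy stays available
            "GlacierJobTier": "BULK",              # bulk tier is the cheapest retrieval option
        }
    },
    Manifest={
        "Spec": {
            "Format": "S3BatchOperations_CSV_20180820",
            "Fields": ["Bucket", "Key"],
        },
        "Location": {
            "ObjectArn": "arn:aws:s3:::example-manifest-bucket/manifest.csv",
            "ETag": "example-manifest-etag",       # ETag of the manifest object
        },
    },
    Report={
        "Bucket": "arn:aws:s3:::example-report-bucket",
        "Format": "Report_CSV_20180820",
        "Enabled": True,
        "Prefix": "batch-restore-reports",
        "ReportScope": "FailedTasksOnly",
    },
    Priority=10,
    RoleArn="arn:aws:iam::111122223333:role/example-batch-operations-role",
    Description="Bulk restore of archived objects",
)
print("Created job:", response["JobId"])
```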

On your pricing question, for 10k objects you would pay the S3 Batch Operations job fee, the S3 Batch Operations per-object fee for all 10k objects, the Glacier restore request fee for 10k requests, and the byte retrieval fee for all bytes retrieved. You can use the bulk retrieval tier to save on retrieval fees: https://aws.amazon.com/s3/pricing/
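As a rough illustration of how those four components add up, here is a small sketch using made-up placeholder prices and an assumed data volume, not current AWS rates; check the pricing page above for the real numbers.

```python
# Hypothetical cost breakdown for a 10,000-object batch restore.
# All prices below are placeholders for illustration only; see
# https://aws.amazon.com/s3/pricing/ for current rates.
num_objects = 10_000
total_gb_retrieved = 500                      # assumed total size of the restored data

job_fee = 0.25                                # placeholder: flat fee per Batch Operations job
per_object_fee = 1.00 / 1_000_000             # placeholder: Batch Operations fee per object
restore_request_fee = 0.025 / 1_000           # placeholder: Glacier bulk restore fee per request
retrieval_fee_per_gb = 0.0025                 # placeholder: bulk retrieval fee per GB

total = (
    job_fee
    + num_objects * per_object_fee
    + num_objects * restore_request_fee
    + total_gb_retrieved * retrieval_fee_per_gb
)
print(f"Estimated cost: ${total:.2f}")
```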

Edited by: robwaws on Apr 28, 2020 6:37 AM

AWS
awsrwx
answered 4 years ago
