AWS S3 Batch Operations - How is throttling handled, and is the cost per request?


We are planning to call the Restore API using S3 Batch Operations. I have two questions for which I couldn't find an answer in the forum or in the documentation:

  1. How does S3 Batch Operations handle throttling errors? What is the throttling behavior for the S3 Restore API, i.e., what is the default limit per account?
  2. If an S3 Batch Operations job has 10K S3 objects, will the customer be charged for 10K requests to S3 or just for the batch job itself?

Edited by: SarthakAWS on Apr 23, 2020 9:20 AM

Asked 4 years ago · 771 views
1 Answer

Thanks for your questions.

On your first question, S3 Batch Operations manages retries on your behalf, so any throttling is handled by varying the job's rate of execution. For S3 Glacier or S3 Glacier Deep Archive restores, the overall rate of execution depends on a number of factors, including other application traffic outside of S3 Batch Operations that may be issuing restore requests at the same time.
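As an illustration (not part of the original answer), here is a minimal sketch of watching a job's progress while S3 Batch Operations retries throttled restore tasks internally; the account ID and job ID are placeholder assumptions.

```python
import time
import boto3

# Assumed placeholders: replace with your own account ID and the job ID
# returned when the batch job was created.
ACCOUNT_ID = "111122223333"
JOB_ID = "example-job-id"

s3control = boto3.client("s3control", region_name="us-east-1")

# S3 Batch Operations retries throttled tasks itself, so the caller only
# needs to poll the job status and progress counters.
while True:
    job = s3control.describe_job(AccountId=ACCOUNT_ID, JobId=JOB_ID)["Job"]
    progress = job.get("ProgressSummary", {})
    print(
        f"Status={job['Status']} "
        f"Succeeded={progress.get('NumberOfTasksSucceeded', 0)} "
        f"Failed={progress.get('NumberOfTasksFailed', 0)} "
        f"Total={progress.get('TotalNumberOfTasks', 0)}"
    )
    if job["Status"] in ("Complete", "Failed", "Cancelled"):
        break
    time.sleep(60)
```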

On your pricing question, for 10K objects you would pay the S3 Batch Operations job fee, the S3 Batch Operations per-object fee for 10K objects, the S3 Glacier restore request fee for 10K objects, and the data retrieval fee for all bytes retrieved. You can use the bulk retrieval tier to save on retrieval fees: https://aws.amazon.com/s3/pricing/
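For completeness, here is a minimal sketch (not from the original answer) of creating a batch restore job that uses the bulk retrieval tier mentioned above; the account ID, manifest location and ETag, report bucket, and role ARN are all placeholder assumptions.

```python
import boto3

s3control = boto3.client("s3control", region_name="us-east-1")

# Assumed placeholders: account ID, manifest ARN/ETag, report bucket, and
# IAM role ARN must all be replaced with real values for your account.
response = s3control.create_job(
    AccountId="111122223333",
    ConfirmationRequired=False,
    Operation={
        "S3InitiateRestoreObject": {
            "ExpirationInDays": 7,     # how long the restored copy stays available
            "GlacierJobTier": "BULK",  # bulk tier: lowest-cost retrieval option
        }
    },
    Manifest={
        "Spec": {
            "Format": "S3BatchOperations_CSV_20180820",
            "Fields": ["Bucket", "Key"],
        },
        "Location": {
            "ObjectArn": "arn:aws:s3:::example-manifest-bucket/manifest.csv",
            "ETag": "example-manifest-etag",
        },
    },
    Report={
        "Bucket": "arn:aws:s3:::example-report-bucket",
        "Format": "Report_CSV_20180820",
        "Enabled": True,
        "Prefix": "batch-restore-reports",
        "ReportScope": "AllTasks",
    },
    Priority=10,
    RoleArn="arn:aws:iam::111122223333:role/example-batch-operations-role",
)
print("Created job:", response["JobId"])
```

Note that the job, per-object, restore request, and retrieval charges listed above all apply regardless of which retrieval tier you choose; the bulk tier only reduces the restore request and retrieval portions.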

Edited by: robwaws on Apr 28, 2020 6:37 AM

AWS
awsrwx
Answered 4 years ago
