AWS S3 Batch Operations - How is throttling handled, and is the cost per request?


We are planning to call the Restore API using S3 Batch Operations (a rough sketch of the planned job setup is included after the questions). I have two questions that I couldn't find an answer to in the forums or in the documentation:

  1. How does S3 Batch Operations handle throttling errors? What is the throttling behavior for the S3 Restore API, i.e., what is the default limit per account?
  2. If an S3 Batch Operations job has 10K S3 objects, will the customer be charged for 10K requests to S3, or just for the S3 Batch Operations job?
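
For reference, this is roughly how we plan to create the restore job with boto3. It is only a sketch: the account ID, bucket names, manifest location, ETag, and IAM role ARN below are placeholders, not real values.

```python
import boto3

# Sketch only: all identifiers below are placeholders.
s3control = boto3.client("s3control", region_name="us-east-1")

response = s3control.create_job(
    AccountId="111122223333",                      # placeholder account ID
    ConfirmationRequired=False,
    Operation={
        "S3InitiateRestoreObject": {
            "ExpirationInDays": 7,                 # how long the restored copy stays available
            "GlacierJobTier": "STANDARD",          # or "BULK"
        }
    },
    Manifest={
        "Spec": {
            "Format": "S3BatchOperations_CSV_20180820",
            "Fields": ["Bucket", "Key"],
        },
        "Location": {
            "ObjectArn": "arn:aws:s3:::example-manifest-bucket/restore-manifest.csv",
            "ETag": "example-manifest-etag",       # ETag of the manifest object
        },
    },
    Report={
        "Bucket": "arn:aws:s3:::example-report-bucket",
        "Format": "Report_CSV_20180820",
        "Enabled": True,
        "Prefix": "batch-restore-reports",
        "ReportScope": "FailedTasksOnly",
    },
    Priority=10,
    RoleArn="arn:aws:iam::111122223333:role/example-batch-ops-role",
    Description="Restore 10K objects from Glacier",
)
print("Created job:", response["JobId"])
```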

Edited by: SarthakAWS on Apr 23, 2020 9:20 AM

Asked 4 years ago · Viewed 772 times

1 Answer

Thanks for your questions.

On the first one, S3 Batch Operations manages retries on your behalf, so any throttling is handled by varying the job's rate of execution. For S3 Glacier or S3 Glacier Deep Archive restores, the overall rate of execution depends on a number of factors, including other application traffic outside of S3 Batch Operations that may also be issuing restore requests at the same time.
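
Because the job paces and retries the work itself, you mostly just observe its progress. A minimal sketch of how you might watch a job, assuming boto3 and hypothetical account and job IDs:

```python
import time
import boto3

s3control = boto3.client("s3control", region_name="us-east-1")

# Placeholders: account ID and job ID are hypothetical.
account_id = "111122223333"
job_id = "example-job-id"

# Poll the job; S3 Batch Operations retries throttled restore requests itself,
# so the application only sees the aggregate task counts and job status.
while True:
    job = s3control.describe_job(AccountId=account_id, JobId=job_id)["Job"]
    progress = job.get("ProgressSummary", {})
    print(
        job["Status"],
        "succeeded:", progress.get("NumberOfTasksSucceeded"),
        "failed:", progress.get("NumberOfTasksFailed"),
        "total:", progress.get("TotalNumberOfTasks"),
    )
    if job["Status"] in ("Complete", "Failed", "Cancelled"):
        break
    time.sleep(60)
```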

On your pricing question, for 10K objects you would pay the S3 Batch Operations job fee, the S3 Batch Operations per-object fee for 10K objects, Glacier restore request fees for 10K objects, and the byte retrieval fee for all bytes retrieved. You can use the bulk retrieval tier to save on retrieval fees: https://aws.amazon.com/s3/pricing/
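
For example, the Bulk tier can be requested in the job's `S3InitiateRestoreObject` operation (boto3 naming shown; the expiration value is just an illustration):

```python
# Sketch: choosing the Bulk retrieval tier for an S3 Batch Operations restore job.
operation = {
    "S3InitiateRestoreObject": {
        "ExpirationInDays": 7,
        "GlacierJobTier": "BULK",  # Bulk tier: lowest retrieval cost, longest restore time
    }
}
# Pass this dict as the Operation argument to s3control.create_job(...).
```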

Edited by: robwaws on Apr 28, 2020 6:37 AM

AWS
awsrwx
Answered 4 years ago
