Enabling lifecycle rules on a huge S3 bucket

Hello all, I have a huge S3 bucket with almost 7 billion objects and 270 TB of data. This bucket receives almost 2 million objects per day, and the average object size is 37 KB. Enabling Intelligent-Tiering here is a no-go because the per-object monitoring charge makes scanning that many objects really expensive. I'm thinking of enabling Infrequent Access, but I don't know if doing this could lead to high costs just from enabling it on a bucket this size. Any advice on this would be helpful, thanks.

Asked 2 years ago · Viewed 348 times
2 Answers
Accepted Answer

Before choosing how you'll transition this, take a look at S3 Storage Lens: https://docs.aws.amazon.com/AmazonS3/latest/userguide/storage_lens_basics_metrics_recommendations.html

This will help you better understand what your usage is like, so you don't transition a bunch of objects and then immediately incur costs to retrieve them. Once you have a good understanding of the usage patterns on the bucket, then I'd implement lifecycle policies: https://docs.aws.amazon.com/AmazonS3/latest/userguide/object-lifecycle-mgmt.html
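As a rough illustration (not part of the answer itself), a lifecycle rule that transitions objects to Standard-IA after 30 days could be expressed as the configuration below. The rule ID and bucket name are hypothetical; note that Standard-IA has a 128 KB minimum billable object size and a 30-day minimum storage duration, which matters for a bucket averaging 37 KB per object.

```python
# Sketch of a lifecycle configuration transitioning objects to Standard-IA
# after 30 days. Rule ID is hypothetical; the empty prefix applies the rule
# to the whole bucket. Caution: Standard-IA bills a minimum of 128 KB per
# object, so small objects may cost more there than in Standard.
lifecycle_config = {
    "Rules": [
        {
            "ID": "transition-to-ia-after-30-days",  # hypothetical rule name
            "Status": "Enabled",
            "Filter": {"Prefix": ""},  # empty prefix = entire bucket
            "Transitions": [
                {"Days": 30, "StorageClass": "STANDARD_IA"}
            ],
        }
    ]
}

# Applying it would look like this (requires boto3 and AWS credentials;
# the bucket name is a placeholder, so this part is left commented out):
# import boto3
# s3 = boto3.client("s3")
# s3.put_bucket_lifecycle_configuration(
#     Bucket="my-huge-bucket",
#     LifecycleConfiguration=lifecycle_config,
# )
```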

AWS
Expert
Rob_H
Answered 2 years ago

Leverage a Lambda function on a scheduled cron to call the S3 Delete API for objects you no longer need. Also, follow the S3 best-practice documents on cost below; that may help reduce your spend. You know your use case and data best, so enable lifecycle rules or the Intelligent-Tiering approach if you feel the data in the bucket fits them.

Best practices: https://docs.aws.amazon.com/AmazonS3/latest/userguide/optimizing-performance.html

Cost optimization: https://aws.amazon.com/s3/cost-optimization/

A lot can be done to reduce costs and optimize the bucket; however, it's best to also reach out to your AWS Technical Account Manager or AWS Support, who can work closely with you on reducing costs.

AWS
Support Engineer
Answered 2 years ago
