Enabling lifecycle rules on a huge S3 bucket


Hello all, I have a huge S3 bucket with almost 7 billion objects and 270 TB of data. The bucket receives almost 2 million new objects per day, and the average object size is about 37 KB. Enabling S3 Intelligent-Tiering here is a no-go because the per-object monitoring charge for scanning that many objects is really expensive. I'm thinking of enabling a transition to S3 Standard-Infrequent Access (Standard-IA), but I don't know whether doing this could lead to high costs just from enabling it on a bucket of this size. Any advice would be helpful, thanks.
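For a rough sense of the scale involved, here is a minimal back-of-the-envelope sketch. The per-request rate and the 128 KB minimum are assumptions based on published us-east-1 pricing at the time of writing; verify against the current S3 pricing page before relying on them.

```python
# Back-of-the-envelope estimate of the one-time lifecycle transition cost.
# ASSUMPTION: ~$0.01 per 1,000 lifecycle transition requests into
# Standard-IA (us-east-1 pricing at time of writing).
object_count = 7_000_000_000
cost_per_1000_transitions = 0.01  # USD, assumed rate

one_time_cost = object_count / 1_000 * cost_per_1000_transitions
print(f"One-time transition cost: ~${one_time_cost:,.0f}")  # ~$70,000

# Standard-IA also has a 128 KB minimum billable object size, so a
# 37 KB average object would be billed as if it were 128 KB.
avg_kb, min_billable_kb = 37, 128
print(f"Billed at {min_billable_kb / avg_kb:.1f}x actual size per small object")
```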

Asked 2 years ago · 348 views

2 Answers
Accepted Answer

Before choosing how you'll transition this, take a look at S3 Storage Lens: https://docs.aws.amazon.com/AmazonS3/latest/userguide/storage_lens_basics_metrics_recommendations.html
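If you want to check programmatically which Storage Lens dashboards already exist in your account, a minimal sketch with boto3's s3control client might look like this; the account ID is a placeholder.

```python
import boto3

s3control = boto3.client("s3control")

# ASSUMPTION: "123456789012" is a placeholder; use your own account ID.
resp = s3control.list_storage_lens_configurations(AccountId="123456789012")
for config in resp.get("StorageLensConfigurationList", []):
    print(config["Id"], config["IsEnabled"], config["StorageLensArn"])
```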

This will help you better understand what your usage looks like, so you don't transition a bunch of objects and then immediately incur costs to retrieve them. Once you have a good understanding of the usage patterns on the bucket, then I'd implement lifecycle policies: https://docs.aws.amazon.com/AmazonS3/latest/userguide/object-lifecycle-mgmt.html
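As a concrete starting point, here is a minimal sketch of a lifecycle rule that transitions objects to Standard-IA after 90 days. The bucket name, the "logs/" prefix, and the 90-day threshold are illustrative assumptions, not recommendations; pick values based on what Storage Lens shows you.

```python
import boto3

s3 = boto3.client("s3")

# ASSUMPTION: bucket name, prefix, and the 90-day threshold are placeholders.
s3.put_bucket_lifecycle_configuration(
    Bucket="my-huge-bucket",
    LifecycleConfiguration={
        "Rules": [
            {
                "ID": "transition-old-objects-to-ia",
                "Status": "Enabled",
                "Filter": {
                    "And": {
                        "Prefix": "logs/",
                        # Skip tiny objects: Standard-IA bills a 128 KB
                        # minimum object size, so transitioning smaller
                        # objects can cost more than it saves.
                        "ObjectSizeGreaterThan": 131072,  # 128 KB in bytes
                    }
                },
                "Transitions": [
                    {"Days": 90, "StorageClass": "STANDARD_IA"},
                ],
            }
        ]
    },
)
```

The object-size filter matters here in particular: with a 37 KB average object size, an unfiltered rule would transition billions of objects that are each billed as 128 KB in Standard-IA.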

Rob_H, AWS Expert · answered 2 years ago

If some of this data can simply be removed, you can leverage a Lambda function on a scheduled cron to call the S3 delete APIs for objects you no longer need (see the sketch below). Also, follow the S3 best-practice documents with respect to cost, linked below; that may help reduce the cost. You know your use case and data best: enable lifecycle rules or the Intelligent-Tiering approach if you feel the data in the bucket fits those access patterns.
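A minimal sketch of such a scheduled cleanup Lambda is below. The bucket name, prefix, and 365-day cutoff are illustrative assumptions; note that at billions of objects, a lifecycle expiration rule or S3 Batch Operations will be far more practical than paging through the bucket from Lambda.

```python
import boto3
from datetime import datetime, timedelta, timezone

s3 = boto3.client("s3")

# ASSUMPTION: bucket, prefix, and 365-day cutoff are placeholders.
BUCKET = "my-huge-bucket"
PREFIX = "tmp/"
CUTOFF = datetime.now(timezone.utc) - timedelta(days=365)

def handler(event, context):
    """Delete objects older than CUTOFF, in batches of up to 1,000."""
    paginator = s3.get_paginator("list_objects_v2")
    batch = []
    for page in paginator.paginate(Bucket=BUCKET, Prefix=PREFIX):
        for obj in page.get("Contents", []):
            if obj["LastModified"] < CUTOFF:
                batch.append({"Key": obj["Key"]})
            if len(batch) == 1000:  # DeleteObjects accepts at most 1,000 keys
                s3.delete_objects(Bucket=BUCKET, Delete={"Objects": batch})
                batch = []
    if batch:
        s3.delete_objects(Bucket=BUCKET, Delete={"Objects": batch})
```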

Best practices: https://docs.aws.amazon.com/AmazonS3/latest/userguide/optimizing-performance.html

Cost optimization: https://aws.amazon.com/s3/cost-optimization/

A lot can be done to reduce costs and optimize the bucket; however, it is best to also reach out to your AWS Technical Account Manager or AWS Support, who can work closely with you on reducing costs.

AWS Support Engineer · answered 2 years ago
