Enabling lifecycle rules on a huge S3 bucket


Hello all, I have a huge S3 bucket with almost 7 billion objects and 270 TB of data. The bucket receives almost 2 million new objects per day, and the average object size is 37 KB. Enabling Intelligent-Tiering here is a no-go because the per-object monitoring charge for scanning that many objects is really expensive. I'm thinking of enabling Infrequent Access, but I don't know whether doing this could lead to high costs just from enabling it on a bucket of this size. Any advice on this would be helpful, thanks.
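For reference, here's the back-of-the-envelope math behind my concern, using assumed us-east-1 list prices (please verify against the current S3 pricing page and correct me if the rates are off):

```python
# Rough cost math for this bucket; the $/1,000 rates below are assumed
# us-east-1 list prices and should be checked against current pricing.
objects = 7_000_000_000

# Intelligent-Tiering monitoring fee: ~$0.0025 per 1,000 objects/month.
# Note: objects under 128 KB aren't monitored (or charged), but they
# also never leave the Frequent Access tier, so at a 37 KB average most
# of this bucket wouldn't benefit from auto-tiering anyway.
print(f"IT monitoring:  ~${objects / 1_000 * 0.0025:,.0f} per month")

# Lifecycle transition requests into Standard-IA: ~$0.01 per 1,000.
print(f"IA transitions: ~${objects / 1_000 * 0.01:,.0f} one-time")

# Standard-IA also bills a 128 KB minimum object size, so 37 KB objects
# would be charged storage as if they were 128 KB each.
```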

Asked 2 years ago · 348 views
2 Answers
Accepted Answer

Before choosing how you'll transition this, take a look at S3 Storage Lens: https://docs.aws.amazon.com/AmazonS3/latest/userguide/storage_lens_basics_metrics_recommendations.html

This will help you better understand what your usage looks like, so you don't transition a bunch of objects and then immediately incur costs to retrieve them. Once you have a good understanding of the bucket's access patterns, I'd then implement lifecycle policies (a minimal example follows): https://docs.aws.amazon.com/AmazonS3/latest/userguide/object-lifecycle-mgmt.html
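For example, once Storage Lens confirms the access pattern, a transition rule can be applied with one API call. A minimal sketch with boto3 (the bucket name and the 60-day threshold are placeholders; pick the threshold from your actual access data):

```python
import boto3

s3 = boto3.client("s3")

# Minimal lifecycle sketch: move objects to Standard-IA 60 days after
# creation. Bucket name and day count are placeholders to adjust.
s3.put_bucket_lifecycle_configuration(
    Bucket="my-huge-bucket",  # placeholder
    LifecycleConfiguration={
        "Rules": [
            {
                "ID": "transition-to-standard-ia",
                "Filter": {"Prefix": ""},  # applies to the whole bucket
                "Status": "Enabled",
                "Transitions": [{"Days": 60, "StorageClass": "STANDARD_IA"}],
            }
        ]
    },
)
```

Keep in mind the rule also processes existing objects asynchronously, so the one-time transition request charge for the objects already in the bucket is worth estimating before you enable it.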

Rob_H (AWS Expert)
Answered 2 years ago

You could run a Lambda function on a scheduled cron (EventBridge) rule to call the S3 delete APIs for objects you no longer need (see the sketch after the links below). Also, follow the S3 best-practice documents on cost linked here; they may help reduce your bill. You know your use case and data best: enable lifecycle rules or the Intelligent-Tiering approach if you feel the data in the bucket suits them.

Best practices: https://docs.aws.amazon.com/AmazonS3/latest/userguide/optimizing-performance.html

Cost optimization: https://aws.amazon.com/s3/cost-optimization/
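As a rough sketch of the scheduled-delete idea above (bucket, prefix, and retention period are hypothetical placeholders; it assumes an EventBridge cron schedule invokes the function):

```python
import datetime

import boto3

s3 = boto3.client("s3")

# Hypothetical names; an EventBridge (cron) schedule would trigger this.
BUCKET = "my-huge-bucket"
PREFIX = "logs/"
RETENTION_DAYS = 365


def lambda_handler(event, context):
    """Delete objects under PREFIX older than RETENTION_DAYS."""
    cutoff = datetime.datetime.now(datetime.timezone.utc) - datetime.timedelta(
        days=RETENTION_DAYS
    )
    batch = []
    for page in s3.get_paginator("list_objects_v2").paginate(
        Bucket=BUCKET, Prefix=PREFIX
    ):
        for obj in page.get("Contents", []):
            if obj["LastModified"] < cutoff:
                batch.append({"Key": obj["Key"]})
            if len(batch) == 1000:  # DeleteObjects accepts at most 1,000 keys
                s3.delete_objects(
                    Bucket=BUCKET, Delete={"Objects": batch, "Quiet": True}
                )
                batch = []
    if batch:
        s3.delete_objects(Bucket=BUCKET, Delete={"Objects": batch, "Quiet": True})
```

Note that with billions of objects a single invocation can't finish listing within Lambda's 15-minute limit, so scope the prefix narrowly, or use a lifecycle expiration rule instead, which deletes objects without per-request charges.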

A lot can be done to reduce costs and optimize the bucket; however, it is best to also reach out to your AWS Technical Account Manager or AWS Support, who can work closely with you on reducing costs.

AWS (Support Engineer)
Answered 2 years ago
