Enabling lifecycle rules on a huge S3 bucket


Hello all, I have a huge S3 bucket with almost 7 billion objects and 270 TB of data. The bucket receives almost 2 million new objects per day, and the average object size is 37 KB. Enabling S3 Intelligent-Tiering here is a no-go because the per-object monitoring charge would be really expensive. I'm thinking of enabling Infrequent Access instead, but I don't know whether doing this could lead to high costs just from enabling it on a bucket of this size. Any advice would be helpful, thanks.
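For scale, here is a rough back-of-envelope estimate of what a one-time transition of the whole bucket could cost. The per-1,000-requests transition price and the 128 KB Standard-IA minimum billable size are assumptions based on published S3 pricing; check the current pricing page before relying on them.

```python
# Rough back-of-envelope estimate of a one-time lifecycle transition cost.
# Both pricing figures below are assumptions -- verify against the current
# S3 pricing page for your region.
TRANSITION_COST_PER_1000 = 0.01   # assumed USD per 1,000 lifecycle transitions into Standard-IA
OBJECT_COUNT = 7_000_000_000      # ~7 billion objects, from the question

one_time_transition_cost = OBJECT_COUNT / 1000 * TRANSITION_COST_PER_1000
print(f"Estimated one-time transition cost: ${one_time_transition_cost:,.0f}")

# Also note: Standard-IA has a 128 KB minimum billable object size, so
# 37 KB objects would each be billed as 128 KB of IA storage after transition.
```

Even at these assumed rates, the transition request charge alone runs into tens of thousands of dollars, so the decision deserves the usage analysis discussed in the answers below.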

2 Answers
Accepted Answer

Before choosing how you'll transition this, take a look at S3 Storage Lens: https://docs.aws.amazon.com/AmazonS3/latest/userguide/storage_lens_basics_metrics_recommendations.html

This will help you better understand what your usage looks like, so you don't transition a bunch of objects and then immediately incur costs to retrieve them. Once you have a good understanding of the bucket's usage patterns, I'd then implement lifecycle policies. https://docs.aws.amazon.com/AmazonS3/latest/userguide/object-lifecycle-mgmt.html
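If the Storage Lens data supports a transition, a lifecycle rule might look like the sketch below. The bucket name, rule ID, and 30-day threshold are placeholders to tune against the actual access patterns; applying the rule uses boto3's `put_bucket_lifecycle_configuration` and needs the matching IAM permission.

```python
# Sketch of a lifecycle rule that transitions objects to Standard-IA after
# 30 days. Rule ID, prefix, and day threshold are placeholder values -- set
# them from the Storage Lens findings.
lifecycle_configuration = {
    "Rules": [
        {
            "ID": "transition-to-standard-ia",  # hypothetical rule name
            "Filter": {"Prefix": ""},           # empty prefix = whole bucket
            "Status": "Enabled",
            "Transitions": [
                {"Days": 30, "StorageClass": "STANDARD_IA"}
            ],
        }
    ]
}

# To apply (requires boto3 and s3:PutLifecycleConfiguration permission):
# import boto3
# s3 = boto3.client("s3")
# s3.put_bucket_lifecycle_configuration(
#     Bucket="my-huge-bucket",  # placeholder bucket name
#     LifecycleConfiguration=lifecycle_configuration,
# )
```

Note that the transition itself is billed per request, so on a 7-billion-object bucket a narrower `Prefix` filter can keep the one-time cost bounded.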

AWS
Expert
Rob_H
answered 2 years ago

Leverage a Lambda function on a scheduled cron (e.g. an EventBridge rule) to call the S3 delete APIs for objects you no longer need. Also, follow the S3 best-practice documents with respect to cost, linked below; that may reduce your spend. You know your use case and data best: enable lifecycle rules or the Intelligent-Tiering approach if you determine the data in the bucket fits them.

Best Practices- https://docs.aws.amazon.com/AmazonS3/latest/userguide/optimizing-performance.html

Cost optimization- https://aws.amazon.com/s3/cost-optimization/

A lot can be done to reduce costs and optimize the bucket; however, it is also worth reaching out to your AWS Technical Account Manager or AWS Support, who can work closely with you on reducing costs.
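The Lambda-on-a-cron deletion idea above might be sketched as follows. `delete_objects` accepts at most 1,000 keys per call, so keys must be batched; the bucket name and 365-day retention window are placeholders, and boto3 is assumed to be available (it ships with the Lambda Python runtime).

```python
# Sketch of a scheduled-deletion Lambda. Bucket name and retention window
# are placeholders; adapt them to your own data.
BUCKET = "my-huge-bucket"   # placeholder bucket name
BATCH_SIZE = 1000           # delete_objects accepts at most 1,000 keys per call

def batch_keys(keys, batch_size=BATCH_SIZE):
    """Split a list of object keys into delete_objects-sized batches."""
    for i in range(0, len(keys), batch_size):
        yield keys[i:i + batch_size]

def handler(event, context):
    """Invoked by an EventBridge (cron) schedule; deletes objects older than the window."""
    import datetime
    import boto3  # imported lazily so the batching helper stays testable without AWS

    s3 = boto3.client("s3")
    cutoff = datetime.datetime.now(datetime.timezone.utc) - datetime.timedelta(days=365)

    paginator = s3.get_paginator("list_objects_v2")
    for page in paginator.paginate(Bucket=BUCKET):
        old_keys = [o["Key"] for o in page.get("Contents", []) if o["LastModified"] < cutoff]
        for batch in batch_keys(old_keys):
            s3.delete_objects(
                Bucket=BUCKET,
                Delete={"Objects": [{"Key": k} for k in batch], "Quiet": True},
            )
```

At 7 billion objects, listing the whole bucket from a single Lambda invocation will not finish within the timeout; a lifecycle expiration rule (which deletes for free, without per-request API charges) or S3 Batch Operations usually scales better, so treat this only as an illustration of the approach.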

AWS
Support Engineer
answered 2 years ago
