Hi,
I would suggest a different approach: implement a Lambda trigger on S3 updates, and use that Lambda to record each change in a single S3 file. Then, downloading that one inventory file will give you all changes at once.
This doc gives you details about S3 triggers with Lambda: https://docs.aws.amazon.com/lambda/latest/dg/with-s3-example.html
Best,
Didier
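A minimal sketch of that trigger in Python, assuming a JSON-lines inventory file (the bucket and key names below are placeholders, not part of the original answer):

```python
import json

# Hypothetical names for the inventory location; adjust to your setup.
INVENTORY_BUCKET = "my-inventory-bucket"
INVENTORY_KEY = "changes.jsonl"

def entries_from_event(event):
    """Turn an S3 notification event into one JSON line per changed object."""
    lines = []
    for record in event.get("Records", []):
        lines.append(json.dumps({
            "bucket": record["s3"]["bucket"]["name"],
            "key": record["s3"]["object"]["key"],
            "event": record["eventName"],
            "time": record["eventTime"],
        }))
    return lines

def handler(event, context):
    """Lambda entry point: append this event's changes to the inventory file."""
    import boto3  # imported here so entries_from_event stays testable without AWS
    s3 = boto3.client("s3")
    try:
        body = s3.get_object(Bucket=INVENTORY_BUCKET,
                             Key=INVENTORY_KEY)["Body"].read().decode()
    except s3.exceptions.NoSuchKey:
        body = ""  # first run: no inventory file yet
    body += "".join(line + "\n" for line in entries_from_event(event))
    # Note: this read-modify-write is not safe under heavy concurrent updates.
    s3.put_object(Bucket=INVENTORY_BUCKET, Key=INVENTORY_KEY, Body=body.encode())
```

One caveat worth knowing: S3 has no atomic append, so under a high update rate two concurrent invocations can overwrite each other's entries.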
Hello.
If you cannot change the object names, I don't think you can filter by date using S3's "ListObjectsV2" API, since it only filters by key prefix.
So, as an alternative, how about registering each object's path and creation date in DynamoDB when the S3 object is created?
By setting an S3 event trigger, you can run Lambda whenever an object is created, and that Lambda can register the object's information in DynamoDB, which makes aggregation easy.
https://docs.aws.amazon.com/lambda/latest/dg/with-s3-example.html
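A rough sketch of that Lambda in Python (the table name and attribute names are placeholders for illustration):

```python
TABLE_NAME = "s3-object-index"  # hypothetical table name

def item_from_record(record):
    """Build a DynamoDB item (path, bucket, creation date) from one S3 event record."""
    return {
        "object_key": {"S": record["s3"]["object"]["key"]},
        "bucket": {"S": record["s3"]["bucket"]["name"]},
        "created_at": {"S": record["eventTime"]},
    }

def handler(event, context):
    """Lambda entry point: register each created object in DynamoDB."""
    import boto3  # imported here so item_from_record stays testable without AWS
    dynamodb = boto3.client("dynamodb")
    for record in event.get("Records", []):
        dynamodb.put_item(TableName=TABLE_NAME, Item=item_from_record(record))
```

With `created_at` stored as an ISO-8601 string, date-range queries become simple DynamoDB queries instead of full S3 listings.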
That’s doable but expensive. The idea would be to focus on S3 only. One approach you might consider is utilizing the LastModified property
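For completeness, an S3-only approach based on LastModified might look like the sketch below: ListObjectsV2 returns LastModified with each object, but the date filtering itself has to happen client-side after listing (the function names are illustrative):

```python
from datetime import datetime, timezone

def filter_by_last_modified(objects, since):
    """Keep only listed objects modified at or after `since` (timezone-aware)."""
    return [obj for obj in objects if obj["LastModified"] >= since]

def list_recent_objects(bucket, prefix, since):
    """Paginate through ListObjectsV2 and filter each page by LastModified."""
    import boto3  # imported here so filter_by_last_modified stays testable without AWS
    s3 = boto3.client("s3")
    recent = []
    pages = s3.get_paginator("list_objects_v2").paginate(Bucket=bucket, Prefix=prefix)
    for page in pages:
        recent.extend(filter_by_last_modified(page.get("Contents", []), since))
    return recent
```

The trade-off is that this still lists every object under the prefix on each run, so it stays cheap only for buckets or prefixes of modest size.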