I would like to add many (hundreds of thousands of) small S3 files to a single archive elsewhere on S3. It doesn't necessarily need to be fast, but it does need to be reliable. I can stream data through an archiver and back to S3 in a single Lambda at a small scale, but since I need to read every single object, at full scale that's a lot to ask of a single Lambda.
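For reference, the single-Lambda version looks roughly like this (bucket names and the destination key are placeholders; note the whole tar is buffered in memory, which is part of why it doesn't scale):

```python
import io
import tarfile
import boto3

s3 = boto3.client("s3")

SRC_BUCKET = "my-source-bucket"    # placeholder names
DEST_BUCKET = "my-archive-bucket"
DEST_KEY = "combined/archive.tar"

def handler(event, context):
    # Buffer the tar in memory: fine at small scale, not at full scale.
    buf = io.BytesIO()
    with tarfile.open(fileobj=buf, mode="w") as tar:
        paginator = s3.get_paginator("list_objects_v2")
        for page in paginator.paginate(Bucket=SRC_BUCKET):
            for obj in page.get("Contents", []):
                body = s3.get_object(Bucket=SRC_BUCKET, Key=obj["Key"])["Body"].read()
                info = tarfile.TarInfo(name=obj["Key"])
                info.size = len(body)
                tar.addfile(info, io.BytesIO(body))
    buf.seek(0)
    s3.upload_fileobj(buf, DEST_BUCKET, DEST_KEY)
```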
Could I, for instance, use Step Functions to fan out archiving Lambdas over subsets of the files, with each one contributing parts to a single multipart upload that produces the combined archive?
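Concretely, I'm picturing something like the worker sketched below (all the event field names are made up): an initial step calls `create_multipart_upload`, a Map state fans out workers that each tar their batch of keys and upload it as one part, and a final step calls `complete_multipart_upload` with the collected ETags. One wrinkle I can see is that each tar segment ends with its own end-of-archive zero blocks, so the concatenated result would only extract cleanly with something like `tar --ignore-zeros`.

```python
import io
import tarfile
import boto3

s3 = boto3.client("s3")

def handler(event, context):
    """Worker invoked once per batch by a Step Functions Map state.

    Expected input (hypothetical field names):
      event["src_bucket"]   - bucket holding the small files
      event["keys"]         - the subset of keys this worker archives
      event["dest_bucket"]  - bucket for the combined archive
      event["dest_key"]     - key of the combined archive
      event["upload_id"]    - UploadId from create_multipart_upload
      event["part_number"]  - this worker's 1-based part number
    """
    # Build a tar segment for just this batch of keys.
    buf = io.BytesIO()
    with tarfile.open(fileobj=buf, mode="w") as tar:
        for key in event["keys"]:
            body = s3.get_object(Bucket=event["src_bucket"], Key=key)["Body"].read()
            info = tarfile.TarInfo(name=key)
            info.size = len(body)
            tar.addfile(info, io.BytesIO(body))

    # Upload the segment as one part of the shared multipart upload.
    # Every part except the last must be at least 5 MiB, so batches
    # would have to be sized with that floor in mind.
    resp = s3.upload_part(
        Bucket=event["dest_bucket"],
        Key=event["dest_key"],
        UploadId=event["upload_id"],
        PartNumber=event["part_number"],
        Body=buf.getvalue(),
    )
    # A final step collects these and calls complete_multipart_upload.
    return {"PartNumber": event["part_number"], "ETag": resp["ETag"]}
```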
Are there any better ways to achieve this sort of thing?