The best way to copy large numbers of objects to other buckets seems to be S3 Batch Operations; a minimal boto3 sketch follows the references below. [1] [2]
[1] Performing large-scale batch operations on Amazon S3 objects - Amazon Simple Storage Service
https://docs.aws.amazon.com/AmazonS3/latest/userguide/batch-ops.html
[2] Copy objects - Amazon Simple Storage Service
https://docs.aws.amazon.com/AmazonS3/latest/userguide/batch-ops-copy-object.html
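As a rough illustration of [1] and [2], here is a hedged sketch of creating a Batch Operations copy job with boto3. The account ID, bucket names, manifest location/ETag, and IAM role ARN are all placeholders you would replace with your own values, and the manifest CSV listing the source objects must already exist.

```python
# Minimal sketch: create an S3 Batch Operations job that copies the objects
# listed in a CSV manifest into a destination bucket. All ARNs, bucket names,
# and IDs below are placeholders.
import boto3

s3control = boto3.client("s3control", region_name="us-east-1")

response = s3control.create_job(
    AccountId="111122223333",                     # placeholder account ID
    ConfirmationRequired=False,                   # start without manual confirmation
    Priority=10,
    RoleArn="arn:aws:iam::111122223333:role/batch-ops-copy-role",  # placeholder
    Operation={
        "S3PutObjectCopy": {
            # Destination bucket that receives the copies
            "TargetResource": "arn:aws:s3:::destination-bucket",
        }
    },
    Manifest={
        # CSV manifest listing the source bucket and key of every object to copy
        "Spec": {
            "Format": "S3BatchOperations_CSV_20180820",
            "Fields": ["Bucket", "Key"],
        },
        "Location": {
            "ObjectArn": "arn:aws:s3:::manifest-bucket/manifest.csv",
            "ETag": "manifest-object-etag",       # ETag of the manifest object
        },
    },
    Report={
        # Completion report written when the job finishes
        "Bucket": "arn:aws:s3:::report-bucket",
        "Format": "Report_CSV_20180820",
        "Enabled": True,
        "Prefix": "batch-copy-reports",
        "ReportScope": "AllTasks",
    },
)
print("Created job:", response["JobId"])
```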
However, zipping existing objects seems to require a different approach; a rough sketch follows the references below. [3] [4]
[3] python - possible to copy s3 file and change its zip format while doing so?
https://stackoverflow.com/questions/67355516/possible-to-copy-s3-file-and-change-its-zip-format-while-doing-so
[4] amazon web services - Zip up folders as I upload them to AWS Glacier - Stack Overflow
https://stackoverflow.com/questions/57108755/zip-up-folders-as-i-upload-them-to-aws-glacier
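Along the lines of [3] and [4], one possible approach is to stream the existing objects down, write them into a single zip archive, and upload the archive back to S3. The bucket names, prefix, and destination key below are placeholders; for very large data sets you would spool to disk (or run this on Lambda/EC2) rather than buffering in memory.

```python
# Sketch: zip existing S3 objects under a prefix and upload the resulting
# archive. Bucket names, prefix, and destination key are placeholders.
import io
import zipfile

import boto3

s3 = boto3.client("s3")

SOURCE_BUCKET = "source-bucket"        # placeholder
SOURCE_PREFIX = "logs/2024/"           # placeholder
DEST_BUCKET = "archive-bucket"         # placeholder
DEST_KEY = "archives/logs-2024.zip"    # placeholder

buffer = io.BytesIO()
with zipfile.ZipFile(buffer, "w", compression=zipfile.ZIP_DEFLATED) as archive:
    # List every object under the prefix and add it to the archive
    paginator = s3.get_paginator("list_objects_v2")
    for page in paginator.paginate(Bucket=SOURCE_BUCKET, Prefix=SOURCE_PREFIX):
        for obj in page.get("Contents", []):
            body = s3.get_object(Bucket=SOURCE_BUCKET, Key=obj["Key"])["Body"].read()
            archive.writestr(obj["Key"], body)

buffer.seek(0)
s3.upload_fileobj(buffer, DEST_BUCKET, DEST_KEY)
```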
I hope one of the approaches above fits your requirements.
You might want to compare the speed and compression ratio of the compression algorithms available on your OS: bzip2, xz, gzip, etc. For larger files, bzip2 might be faster than xz.
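One simple way to run such a comparison is with Python's built-in gzip, bz2, and lzma (xz) modules. The input file path below is a placeholder, and results will vary a lot with the kind of data you feed in.

```python
# Quick comparison of compression time and ratio for gzip, bzip2, and xz,
# using Python's standard-library bindings. The input path is a placeholder.
import bz2
import gzip
import lzma
import time
from pathlib import Path

data = Path("sample-input.bin").read_bytes()  # placeholder test file

for name, compress in [("gzip", gzip.compress),
                       ("bzip2", bz2.compress),
                       ("xz", lzma.compress)]:
    start = time.perf_counter()
    compressed = compress(data)
    elapsed = time.perf_counter() - start
    ratio = len(compressed) / len(data)
    print(f"{name:6s} {elapsed:8.2f}s  ratio {ratio:.3f}")
```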