What is the best way to zip a large number of objects from Amazon S3 (TBs of files)?


What is the best way to zip a large number of objects from Amazon S3 (TBs of files) and upload the resulting zip file to another bucket in the same region, under a different storage class such as S3 Glacier?

Agan
asked 2 years ago · 6,932 views
1 Answer

The best way to copy large numbers of objects to other buckets seems to be S3 Batch Operations. [1] [2] A Batch Operations copy job can also place the copies directly into a different storage class such as Glacier; a rough sketch follows the references below.

[1] Performing large-scale batch operations on Amazon S3 objects - Amazon Simple Storage Service
https://docs.aws.amazon.com/AmazonS3/latest/userguide/batch-ops.html

[2] Copy objects - Amazon Simple Storage Service
https://docs.aws.amazon.com/AmazonS3/latest/userguide/batch-ops-copy-object.html
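If you go the Batch Operations route, the job can be created programmatically with boto3's S3Control API. The sketch below is only an illustration under stated assumptions: the account ID, role ARN, bucket ARNs, manifest key, and manifest ETag are placeholders you would replace with your own values, and the CSV manifest listing the source objects must already exist.

import uuid
import boto3

# Hypothetical placeholders - replace with your own values.
ACCOUNT_ID = "111122223333"
ROLE_ARN = "arn:aws:iam::111122223333:role/s3-batch-ops-role"
MANIFEST_BUCKET_ARN = "arn:aws:s3:::my-manifest-bucket"
DEST_BUCKET_ARN = "arn:aws:s3:::my-destination-bucket"
REPORT_BUCKET_ARN = "arn:aws:s3:::my-report-bucket"

s3control = boto3.client("s3control", region_name="us-east-1")

response = s3control.create_job(
    AccountId=ACCOUNT_ID,
    ConfirmationRequired=False,
    Operation={
        "S3PutObjectCopy": {
            "TargetResource": DEST_BUCKET_ARN,
            # Write the copies directly into the Glacier storage class.
            "StorageClass": "GLACIER",
        }
    },
    Manifest={
        "Spec": {
            "Format": "S3BatchOperations_CSV_20180820",
            "Fields": ["Bucket", "Key"],
        },
        "Location": {
            # CSV listing the source objects, one "bucket,key" pair per line.
            "ObjectArn": f"{MANIFEST_BUCKET_ARN}/manifest.csv",
            "ETag": "replace-with-the-manifest-objects-etag",
        },
    },
    Report={
        "Bucket": REPORT_BUCKET_ARN,
        "Format": "Report_CSV_20180820",
        "Enabled": True,
        "Prefix": "batch-op-reports",
        "ReportScope": "AllTasks",
    },
    Priority=10,
    RoleArn=ROLE_ARN,
    ClientRequestToken=str(uuid.uuid4()),
    Description="Copy objects into the Glacier storage class",
)
print("Created job:", response["JobId"])

Note that Batch Operations copies objects one-for-one; it does not combine them into a single archive.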

However, a different approach is needed to zip existing objects, since S3 has no built-in zip operation: the objects have to be streamed through compute you control and the resulting archive written back to S3. [3] [4] A minimal sketch follows the references below.

[3] python - possible to copy s3 file and change its zip format while doing so? - Stack Overflow
https://stackoverflow.com/questions/67355516/possible-to-copy-s3-file-and-change-its-zip-format-while-doing-so

[4] amazon web services - Zip up folders as I upload them to AWS Glacier - Stack Overflow
https://stackoverflow.com/questions/57108755/zip-up-folders-as-i-upload-them-to-aws-glacier
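As a minimal sketch of that second approach, the snippet below streams every object under a prefix into one local zip file and then multipart-uploads the archive with the GLACIER storage class. It assumes enough local scratch space for one archive and uses placeholder bucket, prefix, and path names; for multi-TB data you would shard the work into many smaller archives or across several workers rather than build one giant zip.

import zipfile
import boto3
from boto3.s3.transfer import TransferConfig

# Hypothetical placeholders - replace with your own values.
SOURCE_BUCKET = "my-source-bucket"
SOURCE_PREFIX = "data/2023/"
DEST_BUCKET = "my-archive-bucket"
ARCHIVE_KEY = "archives/data-2023.zip"
LOCAL_ARCHIVE = "/mnt/scratch/data-2023.zip"

s3 = boto3.client("s3")

# Stream every object under the prefix into one local zip file.
with zipfile.ZipFile(LOCAL_ARCHIVE, "w", compression=zipfile.ZIP_DEFLATED) as zf:
    paginator = s3.get_paginator("list_objects_v2")
    for page in paginator.paginate(Bucket=SOURCE_BUCKET, Prefix=SOURCE_PREFIX):
        for item in page.get("Contents", []):
            key = item["Key"]
            body = s3.get_object(Bucket=SOURCE_BUCKET, Key=key)["Body"]
            # Copy the object into the archive in chunks so it never has to
            # fit in memory all at once.
            with zf.open(key, "w") as entry:
                for chunk in body.iter_chunks(chunk_size=8 * 1024 * 1024):
                    entry.write(chunk)

# Multipart-upload the finished archive directly into the Glacier storage class.
config = TransferConfig(multipart_chunksize=64 * 1024 * 1024)
s3.upload_file(
    LOCAL_ARCHIVE,
    DEST_BUCKET,
    ARCHIVE_KEY,
    ExtraArgs={"StorageClass": "GLACIER"},
    Config=config,
)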

Hope one of the above fits your requirements.

mn87
answered 2 years ago
  • You might want to compare the speed and compression ratio of the algorithms available on your OS: bzip2, xz, gzip, etc. For larger files bzip2 might be a better trade-off than xz; the sketch below shows one quick way to benchmark them.
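To get a rough feel for that trade-off before committing to one codec, you can time Python's built-in gzip, bz2, and lzma modules against a representative sample of your data. The sample file path is a placeholder.

import bz2
import gzip
import lzma
import time

# Hypothetical placeholder - point this at a representative sample of your data.
SAMPLE_FILE = "sample.bin"

with open(SAMPLE_FILE, "rb") as f:
    data = f.read()

codecs = {
    "gzip": gzip.compress,
    "bzip2": bz2.compress,
    "xz": lzma.compress,
}

for name, compress in codecs.items():
    start = time.perf_counter()
    compressed = compress(data)
    elapsed = time.perf_counter() - start
    ratio = len(compressed) / len(data)
    print(f"{name:6s} {elapsed:7.2f}s  ratio={ratio:.3f}")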
