What is the best way to zip a large number of objects from Amazon S3 (TBs of files)?


What is the best way to zip a large number of objects from Amazon S3 (TBs of files) and upload that zip file to another bucket in the same region, in a different storage class such as Glacier?

Agan
Asked 2 years ago · 7,133 views
1 Answer

The best way to copy large numbers of objects to another bucket seems to be S3 Batch Operations; its copy operation can also change the storage class (for example, to Glacier) as it copies. [1] [2]

[1] Performing large-scale batch operations on Amazon S3 objects - Amazon Simple Storage Service
https://docs.aws.amazon.com/AmazonS3/latest/userguide/batch-ops.html

[2] Copy objects - Amazon Simple Storage Service
https://docs.aws.amazon.com/AmazonS3/latest/userguide/batch-ops-copy-object.html
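
For illustration, here is a minimal boto3 sketch of such a copy job that also moves the copies to Glacier. It assumes you have already prepared a CSV manifest of the objects and an IAM role that Batch Operations can assume; the account ID, bucket names, ARNs, and ETag below are all placeholders:

```python
import boto3

s3control = boto3.client("s3control", region_name="us-east-1")

# All IDs, bucket names, and ARNs below are hypothetical placeholders.
response = s3control.create_job(
    AccountId="111122223333",
    ConfirmationRequired=False,
    Priority=10,
    RoleArn="arn:aws:iam::111122223333:role/batch-ops-copy-role",
    Operation={
        "S3PutObjectCopy": {
            "TargetResource": "arn:aws:s3:::destination-bucket",
            "StorageClass": "GLACIER",  # write the copies straight to Glacier
        }
    },
    Manifest={
        "Spec": {
            "Format": "S3BatchOperations_CSV_20180820",
            "Fields": ["Bucket", "Key"],
        },
        "Location": {
            "ObjectArn": "arn:aws:s3:::manifest-bucket/manifest.csv",
            "ETag": "example-manifest-etag",
        },
    },
    Report={
        "Bucket": "arn:aws:s3:::report-bucket",
        "Format": "Report_CSV_20180820",
        "Enabled": True,
        "Prefix": "batch-op-reports",
        "ReportScope": "FailedTasksOnly",
    },
)
print("Created job:", response["JobId"])
```

Note that the Batch Operations copy operation only handles objects up to 5 GB each; larger objects need a different mechanism.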

However, zipping existing objects calls for a different approach: S3 offers no server-side compression, so the objects have to be streamed through compute that downloads, compresses, and re-uploads them. [3] [4]

[3] python - possible to copy s3 file and change its zip format while doing so?
https://stackoverflow.com/questions/67355516/possible-to-copy-s3-file-and-change-its-zip-format-while-doing-so

[4] amazon web services - Zip up folders as I upload them to AWS Glacier - Stack Overflow
https://stackoverflow.com/questions/57108755/zip-up-folders-as-i-upload-them-to-aws-glacier
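
The pattern those threads converge on is to stream each object through a zip writer that feeds an S3 multipart upload, so the archive never has to fit in memory or on local disk. Below is a minimal Python/boto3 sketch of that idea, with placeholder bucket names and keys; error handling (e.g., abort_multipart_upload on failure), retries, and parallelism are omitted for brevity:

```python
import io
import zipfile
import boto3

s3 = boto3.client("s3")

SRC_BUCKET = "source-bucket"      # placeholder names
DST_BUCKET = "archive-bucket"
DST_KEY = "archives/backup.zip"
PART_SIZE = 100 * 1024 * 1024     # 100 MiB parts; S3 allows at most 10,000 parts

class MultipartUploadStream(io.RawIOBase):
    """Write-only, non-seekable stream that feeds an S3 multipart upload."""

    def __init__(self, bucket, key, storage_class="GLACIER"):
        self.bucket, self.key = bucket, key
        self.buffer = bytearray()
        self.parts = []
        self.upload_id = s3.create_multipart_upload(
            Bucket=bucket, Key=key, StorageClass=storage_class
        )["UploadId"]

    def writable(self):
        return True

    def write(self, data):
        self.buffer += data
        while len(self.buffer) >= PART_SIZE:
            self._upload_part(self.buffer[:PART_SIZE])
            del self.buffer[:PART_SIZE]
        return len(data)

    def _upload_part(self, chunk):
        number = len(self.parts) + 1
        resp = s3.upload_part(
            Bucket=self.bucket, Key=self.key, UploadId=self.upload_id,
            PartNumber=number, Body=bytes(chunk),
        )
        self.parts.append({"PartNumber": number, "ETag": resp["ETag"]})

    def close(self):
        if not self.closed:
            if self.buffer:  # the final part may be smaller than 5 MiB
                self._upload_part(self.buffer)
            s3.complete_multipart_upload(
                Bucket=self.bucket, Key=self.key, UploadId=self.upload_id,
                MultipartUpload={"Parts": self.parts},
            )
        super().close()

stream = MultipartUploadStream(DST_BUCKET, DST_KEY)
# zipfile can write to a non-seekable stream; force_zip64 allows entries > 4 GiB.
with zipfile.ZipFile(stream, "w", zipfile.ZIP_DEFLATED) as zf:
    paginator = s3.get_paginator("list_objects_v2")
    for page in paginator.paginate(Bucket=SRC_BUCKET):
        for obj in page.get("Contents", []):
            body = s3.get_object(Bucket=SRC_BUCKET, Key=obj["Key"])["Body"]
            with zf.open(obj["Key"], "w", force_zip64=True) as dest:
                for chunk in iter(lambda: body.read(8 * 1024 * 1024), b""):
                    dest.write(chunk)
stream.close()
```

Keep in mind that a single S3 object tops out at 5 TiB, so at the multi-TB scale you would likely shard the output into several zip archives.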

Hope one of the above fits your requirements.

mn87
Answered 2 years ago
  • You might want to compare the speed and compression ratio of the algorithms available on your OS: bzip2, xz, gzip, etc. For larger files, bzip2 might be better than xz (see the sketch below).
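
As a quick way to act on that comment, here is a small sketch using Python's built-in codecs to compare ratio and speed on a representative sample of your data ("sample.bin" is a hypothetical placeholder file):

```python
import bz2
import gzip
import lzma
import time

def benchmark(name, compress, data):
    """Compress data once, then report the size ratio and elapsed time."""
    start = time.perf_counter()
    out = compress(data)
    elapsed = time.perf_counter() - start
    print(f"{name}: {len(out) / len(data):.1%} of original size in {elapsed:.2f}s")

# "sample.bin" is a placeholder; use a file representative of your real data.
with open("sample.bin", "rb") as f:
    data = f.read()

benchmark("gzip",  lambda d: gzip.compress(d, compresslevel=6), data)
benchmark("bzip2", lambda d: bz2.compress(d, compresslevel=9), data)
benchmark("xz",    lambda d: lzma.compress(d, preset=6), data)
```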
