What is the best way to zip a large number of objects from AWS S3 (e.g. TBs of files)?

What is the best way to zip a large number of objects from AWS S3 (e.g. TBs of files) and upload that zip file to another bucket in the same region, under a different storage class such as Glacier?

Agan
Asked 2 years ago · Viewed 7,235 times
1 Answer

The best way to copy large numbers of objects to other buckets seems to be S3 Batch Operations. [1] [2]

[1] Performing large-scale batch operations on Amazon S3 objects - Amazon Simple Storage Service
https://docs.aws.amazon.com/AmazonS3/latest/userguide/batch-ops.html

[2] Copy objects - Amazon Simple Storage Service
https://docs.aws.amazon.com/AmazonS3/latest/userguide/batch-ops-copy-object.html
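For instance, a Batch Operations copy job can be submitted with boto3's S3 Control API. The sketch below is illustrative, not a tested deployment: the account ID, role ARN, bucket names, and manifest location are placeholders, and the CSV manifest must already be uploaded to S3 before the job is created.

```python
def manifest_csv(bucket, keys):
    """Build the CSV manifest body S3 Batch Operations expects:
    one 'Bucket,Key' line per object."""
    return "".join(f"{bucket},{key}\n" for key in keys)

def create_copy_job(account_id, role_arn, manifest_arn, manifest_etag, target_bucket_arn):
    """Submit a Batch Operations job that copies every manifest entry
    into the target bucket with the GLACIER storage class."""
    import boto3  # imported lazily so manifest_csv works without boto3 installed
    s3control = boto3.client("s3control")
    return s3control.create_job(
        AccountId=account_id,
        ConfirmationRequired=False,
        Operation={
            "S3PutObjectCopy": {
                "TargetResource": target_bucket_arn,  # e.g. arn:aws:s3:::my-archive-bucket
                "StorageClass": "GLACIER",
            }
        },
        Manifest={
            "Spec": {
                "Format": "S3BatchOperations_CSV_20180820",
                "Fields": ["Bucket", "Key"],
            },
            "Location": {"ObjectArn": manifest_arn, "ETag": manifest_etag},
        },
        Report={"Enabled": False},
        Priority=10,
        RoleArn=role_arn,
    )
```

Note that a copy job changes the storage class but does not zip anything; bundling objects into a single archive needs the separate approach discussed below.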

However, zipping existing objects requires a different approach. [3] [4]

[3] python - possible to copy s3 file and change its zip format while doing so?
https://stackoverflow.com/questions/67355516/possible-to-copy-s3-file-and-change-its-zip-format-while-doing-so

[4] amazon web services - Zip up folders as I upload them to AWS Glacier - Stack Overflow
https://stackoverflow.com/questions/57108755/zip-up-folders-as-i-upload-them-to-aws-glacier
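In the spirit of [3] and [4], one way is to fetch the objects, pack them into a single zip, and upload the result with a Glacier storage class. A minimal sketch is below; the boto3 wiring is left as comments with placeholder bucket and key names, and an in-memory buffer is only viable for small batches, so at terabyte scale you would spool to disk or stream a multipart upload instead.

```python
import io
import zipfile

def bundle_to_zip(objects):
    """Pack {key: bytes} pairs into a single in-memory zip archive.
    Only suitable for modest totals; TB-scale data needs streaming."""
    buf = io.BytesIO()
    with zipfile.ZipFile(buf, "w", compression=zipfile.ZIP_DEFLATED) as zf:
        for key, body in objects.items():
            zf.writestr(key, body)  # store each object under its S3 key
    buf.seek(0)
    return buf

# Hypothetical boto3 wiring (bucket/key names are placeholders):
# s3 = boto3.client("s3")
# bodies = {k: s3.get_object(Bucket="src-bucket", Key=k)["Body"].read() for k in keys}
# s3.upload_fileobj(bundle_to_zip(bodies), "archive-bucket", "backup.zip",
#                   ExtraArgs={"StorageClass": "GLACIER"})
```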

Hope one of the above fits your requirements.

mn87
Answered 2 years ago
  • You might want to compare the speed and compression ratio of the various compression algorithms available on your OS: bzip2, xz, gzip, etc. For larger files, bzip2 might be better than xz.
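The commenter's suggestion can be checked empirically with Python's standard-library bindings for the same algorithms (gzip, bzip2, and xz/LZMA). This only compares compressed sizes; which algorithm actually wins on speed or ratio depends on your data, so benchmark on a real sample.

```python
import bz2
import gzip
import lzma

def compare_compressors(data):
    """Return the compressed size of `data` under gzip, bzip2, and xz (LZMA).
    Rankings vary by input; run this on a representative sample of your files."""
    return {
        "gzip": len(gzip.compress(data)),
        "bzip2": len(bz2.compress(data)),
        "xz": len(lzma.compress(data)),
    }
```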
