What is the best way to zip a large number of objects from Amazon S3 (like TBs of files)?

What is the best way to zip a large number of objects from Amazon S3 (like TBs of files) and upload that zip file to another bucket in the same region, in another storage class such as Glacier?

Agan
Asked 2 years ago • 7,133 views
1 Answer

The best way to copy large numbers of objects to other buckets seems to be S3 Batch Operations [1] [2]; a minimal sketch of such a copy job follows the links below.

[1] Performing large-scale batch operations on Amazon S3 objects - Amazon Simple Storage Service
https://docs.aws.amazon.com/AmazonS3/latest/userguide/batch-ops.html

[2] Copy objects - Amazon Simple Storage Service
https://docs.aws.amazon.com/AmazonS3/latest/userguide/batch-ops-copy-object.html
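Here is a minimal boto3 sketch of a Batch Operations copy job that writes the copies directly to the GLACIER storage class. The account ID, bucket ARNs, IAM role and manifest ETag are placeholders you would replace with your own values.

```python
# Sketch: create an S3 Batch Operations job that copies every object listed in a
# CSV manifest to a destination bucket, writing the copies to the GLACIER storage class.
# All IDs, ARNs and names below are placeholders.
import boto3

s3control = boto3.client("s3control", region_name="us-east-1")

response = s3control.create_job(
    AccountId="111122223333",                    # your AWS account ID
    ConfirmationRequired=False,                  # start without manual confirmation in the console
    Priority=10,
    RoleArn="arn:aws:iam::111122223333:role/s3-batch-ops-role",
    Operation={
        "S3PutObjectCopy": {
            "TargetResource": "arn:aws:s3:::destination-bucket",
            "StorageClass": "GLACIER",           # copy directly into the Glacier storage class
        }
    },
    Manifest={
        "Spec": {
            "Format": "S3BatchOperations_CSV_20180820",
            "Fields": ["Bucket", "Key"],
        },
        "Location": {
            "ObjectArn": "arn:aws:s3:::manifest-bucket/manifest.csv",
            "ETag": "manifest-object-etag",      # ETag of the uploaded manifest object
        },
    },
    Report={
        "Bucket": "arn:aws:s3:::report-bucket",
        "Format": "Report_CSV_20180820",
        "Enabled": True,
        "Prefix": "batch-op-reports",
        "ReportScope": "FailedTasksOnly",
    },
)
print("Created job:", response["JobId"])
```

Keep in mind that this operation copies objects one-for-one (it does not zip them) and, like CopyObject, it only supports objects up to 5 GB.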

However, zipping the existing objects needs a different approach, since S3 has no built-in operation that compresses or combines objects; the usual pattern is to stream the objects through compute you run yourself, as in [3] [4] and the sketch after the links below.

[3] Possible to copy s3 file and change its zip format while doing so? - Stack Overflow
https://stackoverflow.com/questions/67355516/possible-to-copy-s3-file-and-change-its-zip-format-while-doing-so

[4] Zip up folders as I upload them to AWS Glacier - Stack Overflow
https://stackoverflow.com/questions/57108755/zip-up-folders-as-i-upload-them-to-aws-glacier
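As a rough illustration of the approach discussed in [3] and [4], the sketch below streams all objects under a prefix into a single zip archive on local disk and then uploads the archive to the target bucket with the GLACIER storage class. Bucket names, the prefix and the scratch path are placeholders; for multiple TBs you would run this on an instance with enough scratch space, or shard the work into many smaller archives.

```python
# Sketch: pull objects under a prefix, pack them into one zip, and re-upload the
# archive with StorageClass=GLACIER. Names and paths are placeholders.
import zipfile
import boto3

s3 = boto3.client("s3")
SOURCE_BUCKET = "source-bucket"
DEST_BUCKET = "archive-bucket"
PREFIX = "data/2023/"
ARCHIVE_PATH = "/mnt/scratch/archive-2023.zip"

# ZIP_STORED skips recompressing data that is already compressed (images, parquet, gzip, ...)
with zipfile.ZipFile(ARCHIVE_PATH, "w", compression=zipfile.ZIP_STORED, allowZip64=True) as zf:
    paginator = s3.get_paginator("list_objects_v2")
    for page in paginator.paginate(Bucket=SOURCE_BUCKET, Prefix=PREFIX):
        for obj in page.get("Contents", []):
            body = s3.get_object(Bucket=SOURCE_BUCKET, Key=obj["Key"])["Body"]
            # stream each object into the zip in 8 MiB chunks instead of loading it into memory
            with zf.open(obj["Key"], "w") as dest:
                for chunk in iter(lambda: body.read(8 * 1024 * 1024), b""):
                    dest.write(chunk)

# upload_file switches to multipart upload automatically for large files
s3.upload_file(
    ARCHIVE_PATH,
    DEST_BUCKET,
    "archives/archive-2023.zip",
    ExtraArgs={"StorageClass": "GLACIER"},
)
```

Note that a single S3 object cannot exceed 5 TB, so at the scale you describe you would most likely split the data into several archives anyway.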

Hope one of the above fits your requirements.

mn87
Answered 2 years ago
  • You might want to compare the speed/time of the various compression algorithms available on your OS: bzip2, xz, gzip, etc. For larger files bzip2 might be better than xz. A quick way to compare them on a sample of your data is sketched below.
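For that comparison, here is a small, self-contained Python sketch that times the standard-library gzip, bzip2 and xz codecs on a sample file of your own data (the path is a placeholder). Ratios and timings depend heavily on the data, so measure on a representative sample before committing TBs to one codec.

```python
# Rough comparison of gzip, bzip2 and xz on a sample of your data.
# Replace the path with a representative sample file.
import bz2
import gzip
import lzma
import time

with open("/tmp/sample.dat", "rb") as f:
    data = f.read()

for name, compress in [("gzip", gzip.compress), ("bzip2", bz2.compress), ("xz", lzma.compress)]:
    start = time.perf_counter()
    out = compress(data)
    elapsed = time.perf_counter() - start
    print(f"{name:6s} ratio={len(out) / len(data):.3f} time={elapsed:.2f}s")
```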
