How to efficiently perform streaming of large zip files?


We have a use case where we need to bulk-download around 50 GB of data from S3 to users' local machines. We plan to use an ECS task that picks up files until about 10 GB has been fetched from the source S3 bucket, zips them, and uploads that zip to another S3 bucket. We would repeat this operation until all the zips are generated.

Is there a way to generate a zip of the whole dataset in one go? And how do we then stream this large zip file from our destination S3 bucket to users' local machines?

Asked 1 year ago · 481 views

2 Answers

If you create an EC2 instance with enough memory, it should be possible to copy the files onto the instance and compress them into a single file. However, if speed is the goal, then parallelizing the compression of sets of files would probably be faster, and your ECS approach (perhaps with smaller chunks and more containers) would work well.
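To fan the work out across containers, you first need to split the object list into batches of roughly equal total size. A minimal sketch of that batching step (the `chunk_by_size` helper and its inputs are hypothetical, not part of any AWS SDK):

```python
def chunk_by_size(files, max_bytes):
    """Group (key, size) pairs into batches of at most max_bytes each,
    so each batch can be zipped by a separate container in parallel."""
    batches, current, total = [], [], 0
    for key, size in files:
        # Start a new batch once adding this file would exceed the limit.
        if current and total + size > max_bytes:
            batches.append(current)
            current, total = [], 0
        current.append(key)
        total += size
    if current:
        batches.append(current)
    return batches

# Example: sizes in arbitrary units, 10-unit batches.
print(chunk_by_size([("a", 5), ("b", 4), ("c", 3), ("d", 9)], 10))
# → [['a', 'b'], ['c'], ['d']]
```

Each batch can then be handed to one ECS task (e.g. as a message on an SQS queue) that downloads, zips, and uploads its share.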

If this is an ongoing process, then perhaps a Lambda function could be used to compress new files and transfer them directly?

AWS
Alex_K
Answered 1 year ago
  • Also, to fetch the zip to clients' local machines, I believe we can use the S3 Transfer Manager of the AWS SDK. But any idea how much data can be transferred in one go using Transfer Manager?


Hi,

For your question "Is there a way by which we can generate the zip of the whole data in one go?": currently there is no S3-provided functionality to do this. It must be handled by reading the objects from S3 individually and creating a ZIP archive yourself. If you want to do it within the AWS Cloud, you could, for example, use Lambda (if within the timeout), ECS, EC2, or AWS Batch.
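The key to doing this for 50 GB without 50 GB of memory is to stream each object body into the archive in chunks. A minimal sketch using Python's standard `zipfile`; the `io.BytesIO` objects stand in for boto3 `StreamingBody` responses (which are also file-like), so the same function works with real `s3.get_object(...)["Body"]` streams:

```python
import io
import shutil
import zipfile

def stream_objects_into_zip(object_streams, zip_path, chunk_size=1024 * 1024):
    """Copy each (name, file-like) pair into a ZIP archive in fixed-size
    chunks, so memory use stays bounded regardless of total data size."""
    with zipfile.ZipFile(zip_path, "w", zipfile.ZIP_DEFLATED) as zf:
        for name, stream in object_streams:
            # ZipFile.open(name, "w") gives a writable archive entry
            # we can stream into without buffering the whole object.
            with zf.open(name, mode="w") as entry:
                shutil.copyfileobj(stream, entry, chunk_size)

# Stand-ins for S3 object bodies; with boto3 each stream would be
# s3.get_object(Bucket=..., Key=key)["Body"] instead.
objects = [("a.txt", io.BytesIO(b"hello")), ("b.txt", io.BytesIO(b"world"))]
stream_objects_into_zip(objects, "bundle.zip")
```

The resulting file can then be uploaded to the destination bucket with a multipart upload (e.g. `s3.upload_file`), which itself streams from disk.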

For your question "How do we then stream this large zip file from our destination S3 to users' local?": it is possible to download large files from Amazon S3 in a browser by using the AWS SDK. Please refer to these articles for understanding/examples: https://docs.aws.amazon.com/sdk-for-javascript/v2/developer-guide/s3-browser-examples.html https://docs.aws.amazon.com/AmazonS3/latest/userguide/example_s3_Scenario_UsingLargeFiles_section.html
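Under the hood, large downloads (including what S3 Transfer Manager does) boil down to fetching the object as a sequence of byte ranges and reassembling them. A sketch of that idea, with a local `fake_get` standing in for S3's ranged GET (with boto3 each range would come from `s3.get_object(..., Range=f"bytes={start}-{end}")["Body"].read()`):

```python
def download_in_ranges(get_range, total_size, chunk_size):
    """Fetch a large object as consecutive byte ranges and reassemble
    them — the same idea multipart/ranged downloads use. The ranges
    are independent, so real code can fetch them in parallel."""
    parts = []
    for start in range(0, total_size, chunk_size):
        end = min(start + chunk_size, total_size) - 1  # HTTP ranges are inclusive
        parts.append(get_range(start, end))
    return b"".join(parts)

# Local stand-in for a ranged GET against some stored bytes.
data = bytes(range(256)) * 10
fake_get = lambda start, end: data[start:end + 1]
assert download_in_ranges(fake_get, len(data), 700) == data
```

In practice you would write each part to its offset in a local file rather than joining in memory; Transfer Manager handles that bookkeeping (and retries) for you, with no fixed limit on total object size beyond S3's 5 TB per-object maximum.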

Thanks

AWS
Rama
Answered 1 year ago
