Hi - some steps could be:
- Read the zip file from S3 using the Boto3 S3 resource `Object`
- Open the object using a module that supports working with tar or zip archives
- Iterate over each file in the zip using any available list method
- Write each file back to another bucket in S3
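The steps above can be sketched roughly as follows. This is a minimal sketch, not a definitive implementation: the bucket names and the zip key are assumptions, and the S3 resource is passed in so you can supply `boto3.resource("s3")` yourself.

```python
import io
import zipfile


def copy_zip_contents(s3, source_bucket, key, dest_bucket):
    """Extract every entry of a zip stored in S3 and write each entry
    as its own object to another bucket, keeping the path inside the
    archive as the destination key."""
    # 1. Read the zip object from S3 into memory
    body = s3.Object(source_bucket, key).get()["Body"].read()
    # 2. Open it with the standard-library zipfile module
    with zipfile.ZipFile(io.BytesIO(body)) as zf:
        # 3. Iterate over every entry in the archive ...
        for name in zf.namelist():
            # 4. ... and write it back out to the other bucket
            s3.Object(dest_bucket, name).put(Body=zf.read(name))
```

Usage would be something like `copy_zip_contents(boto3.resource("s3"), "source-bucket", "archive.zip", "dest-bucket")`, where both bucket names and the key `archive.zip` are placeholders. Note this loads the whole archive into memory, which is fine for a demo but worth rethinking for very large zips.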
The suggestion by @Nitin above would certainly work. If preserving the directory tree within the ZIP file is important, you may want to look at mounting the S3 bucket onto the Linux host itself.
The officially supported way would be S3 File Gateway https://aws.amazon.com/blogs/storage/mounting-amazon-s3-to-an-amazon-ec2-instance-using-a-private-connection-to-s3-file-gateway/ but that's expensive, and probably not worth it for a one-off demonstration.
There is also s3fs https://github.com/s3fs-fuse/s3fs-fuse which will do much the same, although I find it rather slow; if it's just for a one-off demonstration you can probably live with that. The README.md of that GitHub project shows where it's available from and how to install it.
There's also a very new offering called Mountpoint for S3 https://aws.amazon.com/blogs/storage/the-inside-story-on-mountpoint-for-amazon-s3-a-high-performance-open-source-file-client/ which I've not used myself yet, but on a quick reading of that blog it may also achieve what you want.
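For the s3fs route, the commands look roughly like this on a Debian/Ubuntu host. This is a hedged sketch: the bucket name `my-bucket`, the mount point, the zip filename, and the access keys are all placeholders, and you may prefer an instance role over a password file (s3fs supports `-o iam_role=auto` for that).

```shell
# Install s3fs from the distro package (name varies by distro)
sudo apt-get install -y s3fs

# s3fs can read credentials from an "ACCESS_KEY:SECRET_KEY" file;
# the keys below are placeholders, not real values
echo "AKIA...:SECRET..." > ~/.passwd-s3fs
chmod 600 ~/.passwd-s3fs

# Mount the bucket (bucket name and mount point are assumptions)
mkdir -p /mnt/my-bucket
s3fs my-bucket /mnt/my-bucket -o passwd_file=~/.passwd-s3fs

# The archive in the bucket now appears as a regular file, with the
# directory tree inside it intact when you unzip on the host
unzip -l /mnt/my-bucket/archive.zip
```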