Hi, some steps could be:
- Read the zip file from S3 using the Boto3 S3 resource `Object`
- Open the object with a module that supports working with tar or zip archives (e.g. Python's `zipfile`)
- Iterate over each file in the archive using any available list method
- Write each file back to another bucket in S3
The suggestion by @Nitin above would certainly work. If preserving the directory tree within the ZIP file is important, you may want to look at mounting the S3 bucket onto the Linux host itself.
The officially supported way would be S3 File Gateway https://aws.amazon.com/blogs/storage/mounting-amazon-s3-to-an-amazon-ec2-instance-using-a-private-connection-to-s3-file-gateway/, but that's expensive and probably not worth it for a one-off demonstration.
There is also s3fs https://github.com/s3fs-fuse/s3fs-fuse which will do much the same, although I find it rather slow. If it's just for a one-off demonstration you can probably live with that. The README.md of that GitHub project shows where it's available from and how to install it.
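A minimal sketch of the s3fs route, assuming s3fs is already installed and using placeholder bucket/key names and credentials:

```shell
# Store credentials in the ACCESS_KEY_ID:SECRET_ACCESS_KEY format s3fs expects
# (replace the placeholders with real values).
echo "AKIA_PLACEHOLDER:SECRET_PLACEHOLDER" > "${HOME}/.passwd-s3fs"
chmod 600 "${HOME}/.passwd-s3fs"

# Mount the bucket onto the local filesystem.
mkdir -p /mnt/s3
s3fs my-source-bucket /mnt/s3 -o passwd_file="${HOME}/.passwd-s3fs"

# The archive can now be unpacked with ordinary filesystem tools,
# preserving its directory tree.
unzip /mnt/s3/archives/example.zip -d /mnt/s3/extracted/
```

Once mounted, anything that works on a local directory (cp, rsync, unzip) works on the bucket, which is what makes this convenient despite the speed penalty.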
There's also a very new offering called Mountpoint for S3 https://aws.amazon.com/blogs/storage/the-inside-story-on-mountpoint-for-amazon-s3-a-high-performance-open-source-file-client/ which I've not used myself yet, but on a quick reading of that blog it may also achieve what you want.