2 Answers
The AWS SDK for Python provides a pair of methods to upload a file to an S3 bucket: upload_file and upload_fileobj. Here's the Boto3 documentation on using those two methods.
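A minimal sketch of the upload_file approach (this assumes boto3 is installed and AWS credentials are already configured; the bucket name you pass in is whatever bucket you own):

```python
import pathlib

def default_key(local_path):
    """Default S3 object key: just the file's base name."""
    return pathlib.Path(local_path).name

def upload_to_s3(local_path, bucket, key=None):
    """Upload a local file to an S3 bucket with upload_file."""
    import boto3  # deferred import so default_key works even without boto3 installed

    s3 = boto3.client("s3")
    # upload_file handles multipart uploads for large files automatically
    s3.upload_file(local_path, bucket, key or default_key(local_path))

# Example: upload_to_s3("filename.txt", "bucket-name")
```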
If, instead of using Python, you want a quick one-liner, the AWS CLI can do it; here's more info on that. As an example, it'd look something like this:
aws s3 cp filename.txt s3://bucket-name
Lastly, don't forget you can just upload directly to S3 via the AWS Console without writing any code. Steps are listed out here.
All three methods still require correct IAM permissions to upload to an S3 bucket, so ensure those exist too! Hope this helps!
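For reference, a minimal identity-based IAM policy granting upload access might look like the following (a sketch only; "bucket-name" is a placeholder, and you should scope the resource to your actual bucket):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": "s3:PutObject",
      "Resource": "arn:aws:s3:::bucket-name/*"
    }
  ]
}
```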
answered 2 years ago
You can do this with a Lambda function to unzip the file once it transfers. Here is an example: s3-uncompressor.
The catch is that I can't upload the files in .zip format to the S3 bucket. They must be transferred in .zip format, unzipped, and then uploaded to the S3 bucket.
In that case, the s3-uncompressor @kentrad mentioned is likely your best option. For large-scale data migrations there's also DataSync, though that may be overkill here. One of its key features is compression during transfer, so you could leave the data unzipped and DataSync would still transfer it to S3 quickly. That said, DataSync requires an agent in the form of a virtual machine that performs the actual migration.
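The Lambda pattern described above can be sketched roughly like this (this is an illustration of the idea, not the actual s3-uncompressor code; it assumes boto3 is available in the Lambda runtime, and "unzipped-bucket" is a placeholder destination):

```python
import io
import zipfile

def file_members(zf):
    """Return only the zip entries that are real files; directory entries are skipped."""
    return [info for info in zf.infolist() if not info.is_dir()]

def handler(event, context):
    """Triggered by an S3 upload event: read the .zip, upload each member unzipped."""
    import boto3  # deferred import so file_members works without boto3 installed

    s3 = boto3.client("s3")
    record = event["Records"][0]["s3"]
    bucket = record["bucket"]["name"]
    key = record["object"]["key"]

    # Read the whole archive into memory; for very large zips, download to /tmp instead.
    body = s3.get_object(Bucket=bucket, Key=key)["Body"].read()
    with zipfile.ZipFile(io.BytesIO(body)) as zf:
        for info in file_members(zf):
            s3.upload_fileobj(zf.open(info), "unzipped-bucket", info.filename)
```

Keep Lambda's limits in mind: memory, /tmp space, and the 15-minute timeout all cap how large an archive this approach can handle.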