2 answers
The AWS SDK for Python (Boto3) provides a pair of methods, upload_file and upload_fileobj, to upload a file to an S3 bucket. Here's the Boto3 documentation on using those two methods.
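As a quick sketch of what that looks like in Boto3 (the bucket name and file paths below are placeholders, not from the original question):

```python
def upload_to_s3(s3_client, local_path, bucket, key):
    """Upload a local file to S3 and return its s3:// URI.

    upload_file streams the file from disk and automatically switches to
    multipart uploads for large files, so it works for big objects too.
    """
    s3_client.upload_file(local_path, bucket, key)
    return f"s3://{bucket}/{key}"

# Usage with a real client (requires AWS credentials configured locally):
#   import boto3
#   upload_to_s3(boto3.client("s3"), "filename.txt", "my-bucket", "filename.txt")
```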
If, instead of using Python, you just want a quick one-liner with the AWS CLI, here's more info on that. As an example, it'd look something like this:
aws s3 cp filename.txt s3://bucket-name
Lastly, don't forget you can just upload directly to S3 via the AWS Console without writing any code. Steps are listed out here.
All three methods still require the correct IAM permissions to upload to the S3 bucket, so make sure those are in place too! Hope this helps!
answered 2 years ago
You can do this with a Lambda function that unzips the file once it arrives in the bucket. Here is an example: s3-uncompressor.
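A hedged sketch of the unzip-on-upload idea behind that sample (this is not the s3-uncompressor's actual code; the destination bucket name is a placeholder):

```python
import io
import zipfile

def extract_members(zip_bytes):
    """Return {member_name: content_bytes} for every file in a zip archive."""
    out = {}
    with zipfile.ZipFile(io.BytesIO(zip_bytes)) as zf:
        for name in zf.namelist():
            if not name.endswith("/"):  # skip directory entries
                out[name] = zf.read(name)
    return out

def lambda_handler(event, context):
    # Assumes the function is triggered by an S3 ObjectCreated event
    # on the bucket that receives the .zip uploads.
    import boto3  # imported here so extract_members stays usable without boto3
    s3 = boto3.client("s3")
    record = event["Records"][0]["s3"]
    bucket = record["bucket"]["name"]
    key = record["object"]["key"]
    zip_bytes = s3.get_object(Bucket=bucket, Key=key)["Body"].read()
    for name, data in extract_members(zip_bytes).items():
        # Write each extracted file to a destination bucket (placeholder name)
        s3.put_object(Bucket="my-unzipped-bucket", Key=name, Body=data)
```

The extraction is done in memory, which keeps the function simple but means very large archives may need a streaming approach or more Lambda memory.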
The catch is that I can't upload the files to the S3 bucket in .zip format. They must be transferred in .zip format, unzipped, and then the contents transferred to the S3 bucket.
Then the s3-uncompressor @kentrad mentioned would likely be the best option. For large-scale data migrations there's also DataSync, but that sounds like it may be overkill here. One of its key features is compression during transfer, so you could leave the data unzipped and DataSync would still transfer it to S3 quickly. That said, DataSync requires an agent in the form of a virtual machine that actually migrates the data.