2 Answers
- Create an IAM role for EC2 with a policy allowing s3:PutObject on the objects in the bucket. The policy looks like this:
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "Stmt1647624289157",
      "Action": [
        "s3:PutObject"
      ],
      "Effect": "Allow",
      "Resource": "arn:aws:s3:::mybucket/*"
    }
  ]
}
- Assign the role to the instance.
- On the instance, copy the files:
aws s3 cp . s3://mybucket --recursive
- When the upload is done, remove the role from the instance and delete the role and policy.
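The role setup described above could be sketched with the AWS CLI roughly as follows; the role, profile, and instance names here are placeholders, and `policy.json` is assumed to contain the PutObject policy shown above:

```shell
# Trust policy letting EC2 assume the role (hypothetical file name)
cat > trust.json <<'EOF'
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": { "Service": "ec2.amazonaws.com" },
      "Action": "sts:AssumeRole"
    }
  ]
}
EOF

# Create the role and attach the PutObject policy as an inline policy
aws iam create-role --role-name s3-upload-role \
    --assume-role-policy-document file://trust.json
aws iam put-role-policy --role-name s3-upload-role \
    --policy-name s3-put-objects --policy-document file://policy.json

# EC2 attaches roles through an instance profile
aws iam create-instance-profile --instance-profile-name s3-upload-profile
aws iam add-role-to-instance-profile \
    --instance-profile-name s3-upload-profile --role-name s3-upload-role
aws ec2 associate-iam-instance-profile --instance-id i-0123456789abcdef0 \
    --iam-instance-profile Name=s3-upload-profile
```

After the copy, the cleanup step reverses these calls (disassociate the profile, remove the role from it, delete the inline policy, then the role and profile).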
0
Assuming you have the data on an EBS volume, you could also do this via DataSync.
Since it is currently not possible to transfer directly from an EBS volume with DataSync, you need an NFS server to share the data directories with a DataSync agent. You can install an NFS server on your EC2 instance, or create a snapshot of the volume, create a new volume from the snapshot, and attach it to an instance that will act as the NFS server.
Once the NFS share is available, follow these steps in DataSync:
- create an agent.
- create a source location which is the NFS and a destination location which is your S3 bucket.
- create a copy task and launch it.
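The three DataSync steps above might look like this with the AWS CLI; every ARN, hostname, and bucket name below is a placeholder, and the S3 bucket-access role is assumed to exist already:

```shell
# Source location: the NFS export served by the EC2 instance
aws datasync create-location-nfs \
    --server-hostname nfs-server.example.internal \
    --subdirectory /export/data \
    --on-prem-config AgentArns=arn:aws:datasync:us-east-1:111122223333:agent/agent-0123456789abcdef0

# Destination location: the S3 bucket, accessed through an IAM role
aws datasync create-location-s3 \
    --s3-bucket-arn arn:aws:s3:::mybucket \
    --s3-config BucketAccessRoleArn=arn:aws:iam::111122223333:role/datasync-s3-role

# Copy task tying source and destination together, then launch it
aws datasync create-task \
    --source-location-arn arn:aws:datasync:us-east-1:111122223333:location/loc-src0123456789a \
    --destination-location-arn arn:aws:datasync:us-east-1:111122223333:location/loc-dst0123456789a
aws datasync start-task-execution \
    --task-arn arn:aws:datasync:us-east-1:111122223333:task/task-0123456789abcdef0
```

The agent itself is usually created through the console activation flow before these calls, since activation requires reaching the agent over the network.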
answered 2 years ago