1 Answer
Hello.
There is no AWS SDK call that uploads a file directly to an EC2 instance.
One workaround is to use an S3 event trigger to invoke a Lambda function, which then runs a Systems Manager Run Command on the instance to copy the object down.
The Lambda code looks like this:
```python
from datetime import datetime
import os
import urllib.parse

import boto3

ssm = boto3.client('ssm')
instance_id = os.environ['instance_id']


def lambda_handler(event, context):
    bucket = event['Records'][0]['s3']['bucket']['name']
    # S3 event keys are URL-encoded (e.g. spaces arrive as '+')
    key = urllib.parse.unquote_plus(event['Records'][0]['s3']['object']['key'])
    # Build the timestamped destination name per invocation,
    # not at cold start, so each upload gets a fresh suffix
    file_name = 'file_name' + datetime.now().strftime('%Y%m%d%H%M%S')
    response = ssm.send_command(
        InstanceIds=[instance_id],
        DocumentName='AWS-RunShellScript',
        Parameters={
            'commands': [
                f'aws s3 cp s3://{bucket}/{key} /home/ec2-user/{file_name}'
            ],
            'executionTimeout': ['3600'],
        },
    )
    return response
```
Okay, I see. As a follow-up question, what would be the best way to upload a 20 GB file to an EC2 instance without using the AWS SDK and S3?
How about uploading with the SCP command? If you really want to upload to EC2 from Python, you can use a module called "paramiko" to drive SFTP from Python code. https://www.paramiko.org/
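A minimal sketch of such an SFTP upload with paramiko is below. The hostname, username, key file, and paths are placeholders (assumptions), and paramiko is a third-party package (`pip install paramiko`):

```python
import os


def remote_dest(user: str, filename: str) -> str:
    """Build the destination path in the remote user's home directory."""
    return f"/home/{user}/{filename}"


def upload(host: str, user: str, key_file: str, local_path: str) -> None:
    """Copy local_path to the instance over SFTP using key-based SSH auth."""
    import paramiko  # third-party: pip install paramiko

    client = paramiko.SSHClient()
    # Accept the host key on first connect; for production, load known_hosts instead
    client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
    client.connect(host, username=user, key_filename=key_file)
    sftp = client.open_sftp()
    try:
        sftp.put(local_path, remote_dest(user, os.path.basename(local_path)))
    finally:
        sftp.close()
        client.close()


# Example (placeholder values):
# upload("ec2-198-51-100-1.compute-1.amazonaws.com", "ec2-user",
#        "~/.ssh/my-key.pem", "/data/backup-20gb.bin")
```

For a 20 GB transfer, SFTP/SCP streams the file directly to the instance with no intermediate storage, which is the main advantage over the S3 route above.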