The easiest way to achieve this is to use AWS Systems Manager (SSM) to manage the EC2 instance (it is free of charge) and have a Lambda function invoke the SSM SendCommand API. You can run arbitrary shell or PowerShell commands through the SSM agent on the EC2 instance.
So the design will look like this: S3 event -> Lambda -> SSM SendCommand -> command execution on the EC2 instance -> EC2 instance fetches the file from S3.
Of course, you need to make sure these services have the proper IAM policies to allow this.
The drawback of this solution is the lack of a queue: if for whatever reason your instance cannot fetch the file, you cannot retry the scenario. If you need guaranteed delivery, then the suggestion proposed above, having the EC2 instance pull from an SQS queue, is a better choice.
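A minimal sketch of the Lambda side of this flow, assuming boto3 and a hypothetical `INSTANCE_ID` environment variable for the target instance; the bucket and key come from the S3 event:

```python
import os


def extract_object(event):
    """Pull the bucket name and object key out of an S3 event record."""
    record = event["Records"][0]
    return record["s3"]["bucket"]["name"], record["s3"]["object"]["key"]


def lambda_handler(event, context):
    # boto3 is imported and the client created inside the handler so the
    # pure helper above can be used without AWS credentials configured.
    import boto3

    bucket, key = extract_object(event)
    ssm = boto3.client("ssm")
    # Run a shell command on the instance through the SSM agent, using the
    # built-in AWS-RunShellScript document.
    ssm.send_command(
        InstanceIds=[os.environ["INSTANCE_ID"]],  # hypothetical env var
        DocumentName="AWS-RunShellScript",
        Parameters={
            "commands": [f"aws s3 cp s3://{bucket}/{key} /tmp/{key}"]
        },
    )
```

The instance profile needs `s3:GetObject` on the bucket, and the Lambda role needs `ssm:SendCommand`, for this to work end to end.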
I think you will need to use SFTP from the Lambda function to the EC2 instance.
There is no AWS service that pushes a file to an EC2 instance. Instead of triggering a Lambda function, send the S3 event to SQS and let the EC2 instance poll the queue. When a message arrives, the code on the instance downloads the file from S3.
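A sketch of the polling loop on the EC2 instance, assuming boto3 and a hypothetical queue URL; the S3 notification arrives as JSON in the SQS message body:

```python
import json


def parse_s3_event(body):
    """Extract (bucket, key) from an S3 notification delivered via SQS."""
    record = json.loads(body)["Records"][0]
    return record["s3"]["bucket"]["name"], record["s3"]["object"]["key"]


def poll_queue(queue_url):
    # Imported here so the pure parser above works without AWS credentials.
    import boto3

    sqs = boto3.client("sqs")
    s3 = boto3.client("s3")
    while True:
        resp = sqs.receive_message(
            QueueUrl=queue_url,
            MaxNumberOfMessages=1,
            WaitTimeSeconds=20,  # long polling to cut empty receives
        )
        for msg in resp.get("Messages", []):
            bucket, key = parse_s3_event(msg["Body"])
            s3.download_file(bucket, key, f"/tmp/{key}")
            # Delete only after a successful download, so a failed fetch
            # leaves the message on the queue to be retried.
            sqs.delete_message(
                QueueUrl=queue_url, ReceiptHandle=msg["ReceiptHandle"]
            )
```

Because the message is deleted only after the download succeeds, SQS gives you the retry behavior the SSM approach lacks.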
It might be easier to use the `aws s3 cp` command on the EC2 instance to download from S3, rather than using a Lambda function. To know when to run that command, you could have S3 publish a notification to SNS or SQS, and have a process on the EC2 instance that is triggered by SNS/SQS and then downloads the file from S3.
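That CLI call can be wrapped in a small helper on the instance; the destination path here is illustrative, and the AWS CLI must be installed and configured:

```python
import subprocess


def build_cp_command(bucket, key, dest="/tmp"):
    """Build the `aws s3 cp` invocation for a given object."""
    return ["aws", "s3", "cp", f"s3://{bucket}/{key}", f"{dest}/{key}"]


def download(bucket, key, dest="/tmp"):
    # check=True raises CalledProcessError if the copy fails, so the
    # caller (e.g. the SQS consumer) can retry.
    subprocess.run(build_cp_command(bucket, key, dest), check=True)
```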
Please mark this answer as accepted if it resolved your query.