
automatically send new image from s3 to ec2



I am trying to find a way to automatically send a newly uploaded image from an S3 bucket to EC2. On EC2, some image preprocessing techniques will be applied to the image. It is then fed into a deep learning model, and the result should automatically be returned to another S3 bucket.

I tried to do it with a Lambda function, but I don't know how to properly add EC2 as a destination so that the code on EC2 runs automatically when the Lambda function sends the image.

I would appreciate it if you could give me some advice or suggest some references to learn from. My main programming language is Python, but I am also familiar with JavaScript.


Asked 14 days ago · 36 views

While I was writing this, the other answers popped up - but here is more detail on how to use SQS and auto-scaling.

  1. Set up a trigger so that an SQS message is sent when an image is uploaded to S3.
  2. Configure an auto-scaling group based on queue depth.
  3. The auto-scaling group should launch the appropriate AMI (that contains your code) on a GPU-enabled instance.
  4. Have your code retrieve the SQS message, process the image, then delete the message from the SQS queue.
  5. Auto-scaling will then terminate the instance if there are no more messages in the queue.

At step 5 your code could check to see if other messages have arrived and if so, process those in a loop.

The maximum number of instances in step 2 is up to you - do you need to process in parallel? Then have more than one. Do you need to process the images one at a time? Then have a maximum of 1.
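The worker in step 4 can be sketched in a few lines of Python/boto3. This is a minimal sketch, not a definitive implementation: the queue name `image-jobs` is a placeholder, and `process_image` stands in for your own preprocessing and model code.

```python
import json

def parse_s3_records(message_body):
    """Extract (bucket, key) pairs from an S3 event notification body."""
    event = json.loads(message_body)
    return [
        (r["s3"]["bucket"]["name"], r["s3"]["object"]["key"])
        for r in event.get("Records", [])
    ]

def process_image(bucket, key):
    # Placeholder: your preprocessing + deep learning model goes here.
    pass

def main():
    import boto3  # available on the instance once the SDK is installed
    sqs = boto3.client("sqs")
    queue_url = sqs.get_queue_url(QueueName="image-jobs")["QueueUrl"]
    while True:
        resp = sqs.receive_message(
            QueueUrl=queue_url, MaxNumberOfMessages=1, WaitTimeSeconds=20
        )
        messages = resp.get("Messages", [])
        if not messages:
            break  # queue drained; auto-scaling can terminate the instance
        for msg in messages:
            for bucket, key in parse_s3_records(msg["Body"]):
                process_image(bucket, key)
            # Delete only after successful processing, so a crash lets the
            # message reappear after the visibility timeout.
            sqs.delete_message(
                QueueUrl=queue_url, ReceiptHandle=msg["ReceiptHandle"]
            )
```

You would start `main()` from the instance's user data or a systemd unit baked into the AMI; looping until the queue is empty is what makes step 5 (scale-in) safe.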

Answered 14 days ago
  • Thank you very much for your reply. I got some general idea. I'll try it.

  • Hi, I am trying to implement the method you mentioned above. Would you please share more detail on step 4? What code do you mean? Do you mean I should create a Lambda function to create and start an EC2 instance...

  • I'm pretty sure Brettski-AWS means the code in Step 4 is the code set up in Step 3 that's on your EC2 instance. You set up the code either by launching an AMI with the code baked in (pre-prepared by snapshotting an instance you've set up with the code) or via bootstrap code in your Launch Template. This code could be a Python/Boto3 app that loops, performing the retrieve/process/delete steps that Brett mentioned.


Why not do all the preprocessing and the deep learning model in the Lambda function and skip EC2 entirely?

If this isn't an option for some reason - for example, if the run time is longer than Lambda's maximum execution limit of 15 minutes - then you could consider using Lambda to invoke an SSM Run Command. The Lambda function would be invoked when an image is uploaded to the S3 bucket. The Lambda would send a Run Command to the EC2 instance, which could then pick up the image, perform the preprocessing and deep learning model, and return the resulting image to the other S3 bucket.
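As a rough sketch of this approach, the Lambda handler below forwards the uploaded object's location to an EC2 instance via SSM Run Command. The instance ID and the on-instance script path `/opt/app/run_model.py` are placeholders, not real values.

```python
def build_command(bucket, key):
    """Shell command for the instance to run; the script path is hypothetical."""
    return f"python3 /opt/app/run_model.py --bucket {bucket} --key {key}"

def lambda_handler(event, context):
    import boto3  # bundled in the Lambda Python runtime
    ssm = boto3.client("ssm")
    for record in event["Records"]:
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]
        ssm.send_command(
            InstanceIds=["i-0123456789abcdef0"],  # your GPU instance (placeholder)
            DocumentName="AWS-RunShellScript",
            Parameters={"commands": [build_command(bucket, key)]},
        )
```

The instance needs the SSM Agent running and an instance profile that allows SSM, and the Lambda role needs `ssm:SendCommand`.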

Answered 14 days ago
  • Thanks for the reply. Is it possible to share a practical example or reference with me? Actually, my model needs a GPU, so I am not sure if it can be done in a Lambda function.


I would look at S3 Event Notifications - when you upload a new image to S3, you can use this as the trigger to your Lambda function. Event notifications can send events to:

Amazon Simple Notification Service (Amazon SNS) topics.

Amazon Simple Queue Service (Amazon SQS) queues.

AWS Lambda.

Amazon EventBridge.

More on S3 Event Notifications...
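Wiring the notification up can be done in the console or with a few lines of boto3. A minimal sketch, assuming an SQS target and filtering to `.jpg` uploads (the bucket name and queue ARN are placeholders):

```python
def notification_config(queue_arn, suffix=".jpg"):
    """Configuration passed to put_bucket_notification_configuration."""
    return {
        "QueueConfigurations": [
            {
                "QueueArn": queue_arn,
                "Events": ["s3:ObjectCreated:*"],
                "Filter": {
                    "Key": {"FilterRules": [{"Name": "suffix", "Value": suffix}]}
                },
            }
        ]
    }

def attach_notification(bucket):
    import boto3  # needs AWS credentials when actually run
    s3 = boto3.client("s3")
    s3.put_bucket_notification_configuration(
        Bucket=bucket,
        NotificationConfiguration=notification_config(
            "arn:aws:sqs:us-east-1:123456789012:image-jobs"  # placeholder ARN
        ),
    )
```

Note that the SQS queue's access policy must allow `s3.amazonaws.com` to send messages, or the call will fail validation.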

Answered 14 days ago
  • Actually, I can already trigger the Lambda function automatically when an image is uploaded to an S3 bucket. I am looking for a way to automatically invoke an EC2 instance to take the image, apply the ML model to it, and send the result to another S3 bucket.


Expanding on tedrent's answer, it's pretty easy to write a python/boto3 app to run on your EC2 which will pull S3 Event Notifications off an SQS queue and get the corresponding files from S3.
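The S3 side of that app can be sketched as below: fetch the image named in the event, run the (stubbed) model, and write the result to the second bucket. The bucket names, local paths, and `run_model` are assumptions for illustration.

```python
import os

def result_key(key, prefix="results/"):
    """Where the processed image lands in the output bucket."""
    return prefix + os.path.basename(key)

def run_model(path):
    # Placeholder for the real preprocessing + deep learning step;
    # returns the path of the file to upload.
    return path

def handle_image(bucket, key, output_bucket="output-images"):
    import boto3  # installed on the EC2 instance
    s3 = boto3.client("s3")
    local_in = "/tmp/" + os.path.basename(key)
    s3.download_file(bucket, key, local_in)
    local_out = run_model(local_in)
    s3.upload_file(local_out, output_bucket, result_key(key))
```

This slots into the SQS polling loop from the auto-scaling answer: call `handle_image` for each (bucket, key) pulled off the queue.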

回答済み 14日前
  • Thanks for your reply. Would you please share a practical example/reference with me?
