Execute Lambda functions one by one


We have a Lambda function whose invocations overlap in time, which is causing issues for us.

Log events for the two Lambda executions:

  * 2022/09/22/[$LATEST]5f270094955a474787c86b0ce60: 2022-09-22 21:17:52 (UTC+02:00)
  * 2022/09/22/[$LATEST]6f1c42504e6fa3d5f53f2158c592: 2022-09-22 21:28:22 (UTC+02:00)

We noticed that the first execution was still in progress (from 2022-09-22T21:17:46.882+02:00 until 2022-09-22T21:17:52.665+02:00) while the second execution was already running (from 2022-09-22T21:17:47.420+02:00 until 2022-09-22T21:28:22.129+02:00).

Current process is: API Gateway -> Lambda Function

The API Gateway (trigger) passes a JSON payload to the Lambda function.

We are looking for a solution to process the Lambda function invocations one by one: if an invocation is in progress, the next invocation needs to wait until the first one has finished processing.

Thanks for your input ...

2 Answers

The easiest way to serialize Lambda executions (prevent parallel execution) is to put a queue between your API Gateway and Lambda. If you use SQS, you'll need a FIFO queue (or an SNS FIFO topic). Additionally, you will need to set the reserved concurrency of the function to 1. However, you will also need to handle 429 errors when your function is throttled, using a dead-letter queue and reprocessing logic.
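As a sketch of the reprocessing logic mentioned above, a caller can retry on throttling (the HTTP 429 case) with exponential backoff. This is an illustrative, self-contained simulation, not the AWS SDK: `ThrottledError` and the delay values are assumptions standing in for a real throttled response.

```python
import time


class ThrottledError(Exception):
    """Illustrative stand-in for a Lambda throttle (HTTP 429) response."""


def invoke_with_backoff(invoke, max_attempts=5, base_delay=0.5):
    """Call `invoke`, retrying with exponential backoff when it is
    throttled. Gives up and re-raises after `max_attempts` tries."""
    for attempt in range(max_attempts):
        try:
            return invoke()
        except ThrottledError:
            if attempt == max_attempts - 1:
                raise
            # Back off: 0.5s, 1s, 2s, ... before the next attempt.
            time.sleep(base_delay * 2 ** attempt)
```

In a real deployment this retry loop would typically live in the client or be replaced by the SQS redrive policy that moves repeatedly failed messages to the dead-letter queue.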

However, you should also think about your overall architecture. One of the main benefits of Lambda is its ability to scale and handle parallel execution. For this to work, you need to observe some basic design tenets, including idempotency. If serialized execution of a Lambda function is really what you are after, I would argue that Lambda is probably not the right choice of compute. Take a look at this blog for more details.

answered 2 months ago

When you configure API Gateway to invoke a Lambda function, every time you invoke the API, it will invoke your function. If you have some reasons that do not allow you to run the function in parallel, you have two main options:

  1. Return an error to the client when it invokes the function while it is already running. Let the client retry. You can do that by setting a concurrency limit of 1 on the function. (API Gateway -> Lambda)
  2. Place the message into a queue and let Lambda process the messages from the queue. In this case the client will not get an error, but you still need to limit the concurrency to 1. (API Gateway -> SQS -> Lambda)

If you do not want the client to retry and you don't need the answer from the function, you can go with the second option. Otherwise, you can go with the first one.

answered 2 months ago
