Does a Lambda function use a single-request-per-instance concurrency model?


Hi, we're investigating how to migrate our traditional backend apps to AWS Lambda. After searching around and reading the official Lambda docs, my understanding is that Lambda uses a thread-safe, one-instance-per-request concurrency model. Am I right?

The question is: for a runtime like Node.js, which leverages the event loop to maximize throughput on IO-intensive work, can we opt into a single-instance, multiple-concurrent-requests model to mitigate the global concurrency limit and the cold start problem? I also think this could save cost, since we wouldn't be paying for IO wait time.
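For concreteness, here is a rough sketch of the kind of handler we run today: one request per invocation, where the Node.js event loop at least lets independent IO waits overlap (the fetchUser/fetchOrders helpers and the payload shape are made up for illustration):

```javascript
// Hypothetical helpers standing in for real IO (a DB lookup, an HTTP call, ...).
const fetchUser = async (id) => ({ id, name: 'example' });
const fetchOrders = async (id) => [{ orderId: 1, userId: id }];

// One API Gateway request per invocation, but within that single request
// the Node.js event loop still lets independent IO waits overlap.
exports.handler = async (event) => {
  const userId = event.pathParameters.id;

  // Both lookups run concurrently; the invocation pays roughly for the
  // slower of the two calls, not for their sum.
  const [user, orders] = await Promise.all([
    fetchUser(userId),
    fetchOrders(userId),
  ]);

  return {
    statusCode: 200,
    body: JSON.stringify({ user, orders }),
  };
};
```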

I see there are ways to use reserved/provisioned concurrency, but I think it would be better to support more fine-grained concurrency controls.

Thanks!

2 Answers
Accepted Answer

First of all, YES, you're right: you should think of your Lambda function as "single request = single thread-safe, stateless Lambda". This paradigm forces us engineers to detach compute from data (state), scale each independently, avoid shared state and side effects, and in the end reach a high level of parallelism while avoiding the hard-to-debug pitfalls of parallel programming.

Regarding the second half of your question: the real beauty of AWS Lambda is that it lets you move away from low-level concerns such as CPU utilization and IO wait time and focus only on the "business logic" of what you want your code to achieve. (And of course, internally AWS Lambda does quite extensive under-the-hood optimization to avoid wasting resources.)

So technically you can run your own event loop inside a single Lambda invocation and handle multiple requests within each call (a sketch of what that would look like is below). However, I would call it an anti-pattern, and possibly a sign that you might not need Lambda here at all.
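Just to make that concrete (a sketch, not a recommendation, and the { requests: [...] } payload shape is invented for illustration):

```javascript
// Sketch of the anti-pattern: the caller packs several logical "requests"
// into one Lambda payload and the handler fans them out itself.
const handleOne = async (request) => {
  // stand-in for real IO-bound work (DB query, HTTP call, ...)
  return { id: request.id, status: 'done' };
};

exports.handler = async (event) => {
  const requests = event.requests || [];

  // All sub-requests share one execution environment and overlap their IO waits,
  // but you now own routing, error isolation, and per-request timeouts yourself.
  const results = await Promise.all(requests.map(handleOne));

  return { results };
};
```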

I would recommend simply giving it a try and not worrying too much about optimizing the underlying resources. Also, if I'm not mistaken, the default account-level concurrency limit is 1,000 concurrent executions per Region, which should be enough to experiment with high traffic.

AWS
Vadym
answered 2 years ago
  • Sure, thanks for your quick reply! I'll definitely give it a try and update here when I have more thoughts.


Lambda functions run in execution environments, each instance in its own environment. Each instance handles a single request at a time; there is no way to route multiple concurrent requests to the same instance.

That said, there are event sources that support batching, e.g., SQS, Kinesis Data Streams, etc. For those event sources you can configure the function to be invoked with multiple events, handle them within a single invocation, and thereby better utilize the CPU and reduce cost for IO-intensive workloads (a sketch follows below). If your workload can be asynchronous, consider this approach: e.g., if the request comes from API Gateway, don't invoke the function directly; send it to SQS and then invoke the function with batching.
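As an illustration, here is a minimal sketch of an SQS-triggered Node.js handler that processes a batch in one invocation; the doWork helper and message bodies are placeholders, and the partial-failure response assumes ReportBatchItemFailures is enabled on the event source mapping:

```javascript
// Minimal sketch of an SQS batch handler: the event source mapping delivers
// up to N messages per invocation, which are processed concurrently here.
const doWork = async (payload) => {
  // stand-in for the real IO-intensive processing of one message
  return payload;
};

exports.handler = async (event) => {
  const results = await Promise.allSettled(
    event.Records.map((record) => doWork(JSON.parse(record.body)))
  );

  // Report only the failed messages for retry; this response format assumes
  // ReportBatchItemFailures is enabled on the event source mapping.
  const batchItemFailures = results
    .map((result, i) =>
      result.status === 'rejected'
        ? { itemIdentifier: event.Records[i].messageId }
        : null
    )
    .filter(Boolean);

  return { batchItemFailures };
};
```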

AWS
EXPERT
Uri
answered 2 years ago
