AWS Lambda not triggering for all files uploaded to S3 bucket for Puppeteer-based PDF generation in Node


I have a PDF generation service that uses Puppeteer in AWS Lambda. There are two S3 buckets: one triggers the Lambda when a JSON file is uploaded into it, and the resulting PDF is put into the other bucket.

S3(trigger) ---> Lambda ---> S3(result)

When uploading a small number of files (fewer than 50 or 100), it works well: every file from the trigger bucket gets converted to a PDF.

But when, say, 1,000 files are uploaded, some results are missing from the output bucket; this also happens with fewer than 500 files. The missing files occur at random. Each invocation takes at most 40 seconds to convert one JSON file to a PDF.
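
For context, here is a minimal sketch of what the handler does. The bucket name, the JSON-to-HTML step, and the use of puppeteer-core with @sparticuz/chromium are simplifications rather than my exact code:

```javascript
// Minimal sketch of the Lambda handler (AWS SDK v3, puppeteer-core, @sparticuz/chromium).
// RESULT_BUCKET and the JSON-to-HTML rendering are placeholders.
const { S3Client, GetObjectCommand, PutObjectCommand } = require("@aws-sdk/client-s3");
const chromium = require("@sparticuz/chromium");
const puppeteer = require("puppeteer-core");

const s3 = new S3Client({});
const RESULT_BUCKET = process.env.RESULT_BUCKET; // e.g. "my-pdf-results" (placeholder)

exports.handler = async (event) => {
  for (const record of event.Records) {
    const bucket = record.s3.bucket.name;
    // Keys in S3 notifications are URL-encoded; decode before using them.
    const key = decodeURIComponent(record.s3.object.key.replace(/\+/g, " "));

    // 1. Read the uploaded JSON from the trigger bucket.
    const obj = await s3.send(new GetObjectCommand({ Bucket: bucket, Key: key }));
    const data = JSON.parse(await obj.Body.transformToString());

    // 2. Render it to a PDF with headless Chromium.
    const browser = await puppeteer.launch({
      args: chromium.args,
      executablePath: await chromium.executablePath(),
      headless: chromium.headless,
    });
    try {
      const page = await browser.newPage();
      // Placeholder for however the JSON is turned into HTML in the real service.
      await page.setContent(`<pre>${JSON.stringify(data, null, 2)}</pre>`);
      const pdf = await page.pdf({ format: "A4" });

      // 3. Write the PDF to the result bucket.
      await s3.send(new PutObjectCommand({
        Bucket: RESULT_BUCKET,
        Key: key.replace(/\.json$/, ".pdf"),
        Body: pdf,
        ContentType: "application/pdf",
      }));
    } finally {
      await browser.close();
    }
  }
};
```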

asked 7 months ago · 224 views
1 Answer

Hello.

AWS Lambda has concurrency limits, both on the total number of concurrent executions per account per Region and on an individual function if reserved concurrency is configured. Rapidly uploading many files to S3 triggers a correspondingly large burst of concurrent Lambda invocations, which can hit that limit. Because S3 invokes Lambda asynchronously, throttled events are retried with backoff; if they still cannot be processed before the maximum event age expires, they are discarded unless an on-failure destination or dead-letter queue is configured, which would explain why some PDFs are simply missing.
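
As a minimal sketch with the AWS SDK for JavaScript v3, you can read the account concurrency limit and attach an on-failure destination so that events which are still throttled after retries end up in a queue instead of disappearing. The function name and SQS queue ARN below are placeholders:

```javascript
// Sketch: inspect account concurrency and configure an on-failure destination
// for the function's asynchronous invocations. Names/ARNs are placeholders.
const {
  LambdaClient,
  GetAccountSettingsCommand,
  PutFunctionEventInvokeConfigCommand,
} = require("@aws-sdk/client-lambda");

const lambda = new LambdaClient({});

async function main() {
  // How much concurrency this account has in the current Region.
  const settings = await lambda.send(new GetAccountSettingsCommand({}));
  console.log("Account concurrency limit:", settings.AccountLimit.ConcurrentExecutions);
  console.log("Unreserved concurrency:", settings.AccountLimit.UnreservedConcurrentExecutions);

  // Route events that still fail after retries to an SQS queue for inspection/replay.
  await lambda.send(new PutFunctionEventInvokeConfigCommand({
    FunctionName: "pdf-generator",            // placeholder
    MaximumRetryAttempts: 2,
    MaximumEventAgeInSeconds: 21600,          // keep retrying for up to 6 hours
    DestinationConfig: {
      OnFailure: { Destination: "arn:aws:sqs:us-east-1:123456789012:pdf-failures" }, // placeholder
    },
  }));
}

main().catch(console.error);
```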

Regards, Andrii

EXPERT
answered 7 months ago
  • From what I have heard, there is a soft limit of 1,000 concurrent executions per account per Region. If I increase that, will it solve the issue? This also happens when I upload just 500 files.
