AWS Lambda not triggering for all files uploaded to S3 bucket for Puppeteer-based PDF generation in Node


I have a PDF generation service using Puppeteer in AWS Lambda. There are two S3 buckets: one triggers the Lambda when a JSON file is uploaded to it, and the resulting PDF is written to the other bucket.

S3(trigger) ---> Lambda ---> S3(result)
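For context, the handler is shaped roughly like the sketch below. This is a minimal sketch, not the exact service code: puppeteer-core with the @sparticuz/chromium layer, the AWS SDK for JavaScript v3, the RESULT_BUCKET environment variable, and the HTML template are all assumptions.

    // Minimal sketch of a JSON-to-PDF Lambda handler (assumed libraries and names, see above).
    const { S3Client, GetObjectCommand, PutObjectCommand } = require("@aws-sdk/client-s3");
    const chromium = require("@sparticuz/chromium");
    const puppeteer = require("puppeteer-core");

    const s3 = new S3Client({});

    exports.handler = async (event) => {
      for (const record of event.Records) {
        const srcBucket = record.s3.bucket.name;
        const srcKey = decodeURIComponent(record.s3.object.key.replace(/\+/g, " "));

        // Read the uploaded JSON from the trigger bucket.
        const obj = await s3.send(new GetObjectCommand({ Bucket: srcBucket, Key: srcKey }));
        const data = JSON.parse(await obj.Body.transformToString());

        // Render the JSON to a PDF with headless Chromium.
        const browser = await puppeteer.launch({
          args: chromium.args,
          executablePath: await chromium.executablePath(),
          headless: true,
        });
        try {
          const page = await browser.newPage();
          // Placeholder template; the real service renders its own HTML.
          await page.setContent("<pre>" + JSON.stringify(data, null, 2) + "</pre>");
          const pdf = await page.pdf({ format: "A4" });

          // Write the PDF to the result bucket.
          await s3.send(new PutObjectCommand({
            Bucket: process.env.RESULT_BUCKET,
            Key: srcKey.replace(/\.json$/, ".pdf"),
            Body: pdf,
            ContentType: "application/pdf",
          }));
        } finally {
          await browser.close();
        }
      }
    };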

When uploading a small number of files (fewer than 50 or 100), it works well: every file uploaded to the trigger bucket gets converted to a PDF.

But when, say, 1,000 files are uploaded, some results are missing from the output bucket; this also happens with fewer than 500 files. The missing files occur at random. Each request takes at most 40 seconds to convert one JSON file to a PDF.

asked 7 months ago · 233 views
1 Answer

Hello.

AWS Lambda has concurrency limits, both for the total number of concurrent executions per account and for an individual function. If you rapidly upload many files to S3, this triggers a large number of concurrent Lambda executions, which may hit the concurrency limit. Throttled asynchronous invocations are retried, and events that still cannot be processed after the retries are dropped, which would explain the missing PDFs.
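One way to confirm this is to check the account-level concurrency limit and the function's Throttles metric. A minimal sketch with the AWS SDK for JavaScript v3 (the function name "json-to-pdf" is a placeholder):

    // Diagnostic sketch: prints the account concurrency limit and the function's
    // throttled invocations over the last hour, in 5-minute buckets.
    const { LambdaClient, GetAccountSettingsCommand } = require("@aws-sdk/client-lambda");
    const { CloudWatchClient, GetMetricStatisticsCommand } = require("@aws-sdk/client-cloudwatch");

    async function main() {
      const lambda = new LambdaClient({});
      const cloudwatch = new CloudWatchClient({});

      // Account-level limit for concurrent executions (the default soft limit is 1,000 per Region).
      const settings = await lambda.send(new GetAccountSettingsCommand({}));
      console.log("Concurrent executions limit:", settings.AccountLimit.ConcurrentExecutions);

      // Sum of throttled invocations for the function over the last hour.
      const now = new Date();
      const metrics = await cloudwatch.send(new GetMetricStatisticsCommand({
        Namespace: "AWS/Lambda",
        MetricName: "Throttles",
        Dimensions: [{ Name: "FunctionName", Value: "json-to-pdf" }], // placeholder name
        StartTime: new Date(now.getTime() - 3600 * 1000),
        EndTime: now,
        Period: 300,
        Statistics: ["Sum"],
      }));
      console.log("Throttles:", metrics.Datapoints);
    }

    main().catch(console.error);

Non-zero Throttles values during the uploads would point to concurrency as the cause.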

Regards, Andrii

EXPERT
answered 7 months ago
  • From what I have heard, there is a soft limit of 1,000 concurrent executions per account per Region. If I increase that, will it solve the issue? And this happens even when I upload just 500 files.
