AWS Lambda not triggering for all files uploaded to an S3 bucket, for Puppeteer-based PDF generation in Node


I have a PDF generation service that uses Puppeteer in AWS Lambda. There are two S3 buckets: uploading a JSON file to the first triggers the Lambda, and the resulting PDF is written to the second.

S3(trigger) ---> Lambda ---> S3(result)

When uploading a small number of files (under 50–100), it works well: every file in the trigger bucket gets converted to a PDF.

But when around 1,000 files are uploaded, some PDFs are missing from the result bucket; this also happens with fewer than 500 files. Which files go missing appears random. Each invocation takes up to 40 seconds to convert one JSON file to a PDF.

Asked 7 months ago · Viewed 233 times
1 answer

Hello.

AWS Lambda has concurrency limits, both on the total number of concurrent executions in the account and, optionally, per function. Rapidly uploading many files to S3 triggers a burst of concurrent Lambda executions that can hit those limits. Note that S3 invokes Lambda asynchronously: throttled asynchronous invocations are retried for a limited time and then discarded (or routed to a dead-letter queue or failure destination, if one is configured), which would explain the randomly missing PDFs.

Regards, Andrii
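
A rough way to see why this happens is Little's law: average concurrency ≈ arrival rate × per-invocation duration. The numbers below are assumptions based on the question (1,000 uploads spread over a 60-second window, 40 s per PDF), not measured data; a minimal sketch:

```javascript
// Little's law estimate: concurrency ≈ arrival rate × per-invocation duration.
// Assumed inputs (illustrative, not measured): 1,000 uploads over a 60 s
// window, 40 s to render one PDF.
const uploads = 1000;
const windowSeconds = 60;
const durationSeconds = 40;

const arrivalRate = uploads / windowSeconds;                  // ~16.7 events/s
const concurrency = Math.ceil(arrivalRate * durationSeconds); // sustained concurrency

console.log(concurrency); // prints 667
```

Even at roughly 667 concurrent executions, this one function consumes most of the default 1,000-execution account pool, and any other Lambda activity in the same account and Region can push the total over the limit.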

Expert
Answered 7 months ago
  • From what I have heard, there is a soft limit of 1,000 concurrent executions per account per Region. If I increase that, will it solve the issue? This happens even when I upload just 500 files.
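
If raising the limit is the chosen fix, it can be requested through Service Quotas. The commands below are a sketch: the quota code L-B99A9384 ("Concurrent executions" for Lambda) and the CLI subcommands are real, but the function name `pdf-generator` and the desired value 3000 are placeholders, not recommendations:

```shell
# Check the current account-level concurrency quota for Lambda.
aws service-quotas get-service-quota \
  --service-code lambda \
  --quota-code L-B99A9384

# Request an increase (desired value is an example).
aws service-quotas request-service-quota-increase \
  --service-code lambda \
  --quota-code L-B99A9384 \
  --desired-value 3000

# Alternatively, cap this function with reserved concurrency so throttled
# S3 events are retried at a sustainable rate instead of failing in bursts
# (function name is a placeholder).
aws lambda put-function-concurrency \
  --function-name pdf-generator \
  --reserved-concurrent-executions 500
```

Whichever route is taken, checking the function's CloudWatch Throttles metric first confirms whether throttling is actually the cause before changing any quotas.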
