I have a PDF generation service that uses Puppeteer in an AWS Lambda function. There are two S3 buckets: uploading a JSON file to the first bucket triggers the Lambda, and the resulting PDF is written to the second bucket.
S3(trigger) ---> Lambda ---> S3(result)
When I upload a small number of files (fewer than 50 or 100), everything works: every file from the trigger bucket gets converted to a PDF.
But when, say, 1,000 files are uploaded at once, some PDFs are missing from the result bucket; this also happens with fewer than 500 files. Which files go missing appears to be random. Each invocation takes at most 40 seconds to convert one JSON file to a PDF.
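To find exactly which conversions failed, I diff the key listings of the two buckets (pulled with `aws s3api list-objects-v2` or the boto3 paginator). A small helper, assuming the Lambda names each PDF after its source JSON (`foo.json` → `foo.pdf`) — that naming convention is an assumption:

```python
def missing_sources(json_keys, pdf_keys):
    """Given key listings of the trigger and result buckets, return the
    JSON keys whose expected PDF never appeared in the result bucket."""
    pdfs = set(pdf_keys)
    return sorted(
        k for k in json_keys
        if k.endswith(".json") and k[: -len("json")] + "pdf" not in pdfs
    )

# Example: b.json was uploaded but its PDF is missing from the result bucket
print(missing_sources(["a.json", "b.json"], ["a.pdf"]))  # → ['b.json']
```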
From what I have read, there is a soft limit of 1,000 concurrent Lambda executions per account per region. If I request an increase to that limit, will it solve the issue? Note that the problem occurs even when I upload only 500 files.
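For context on why I suspect concurrency, here is my back-of-envelope estimate of how many executions can be in flight at once (the 10-second upload window is a guess; the function name and formula are just my sketch):

```python
def peak_concurrency(files, duration_s, upload_window_s):
    """Rough upper bound on simultaneous Lambda executions:
    every file that arrives within one invocation's lifetime overlaps it."""
    arrival_rate = files / upload_window_s          # files per second
    return min(files, arrival_rate * duration_s)    # can't exceed total files

# 1,000 files dumped within ~10 s, each taking up to 40 s to convert:
print(peak_concurrency(1000, 40, 10))  # → 1000
```

So even 500 files uploaded in one burst could occupy ~500 concurrent executions for their full 40-second lifetime, which is why I am unsure whether the 1,000 limit alone explains the missing files.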