
Why is my Lambda concurrency limited to 30 instead of 1000


I trigger my Lambda function 10,000 times by uploading images to S3:

    for each in imagelist:
        NAME_FOR_S3 = str(each[0]) + '_' + str(each[1]) + '.png'
        LOCAL_FILE = mypath + NAME_FOR_S3
        counter = counter + 1

Lambda takes about 20 seconds to analyze each image, and I have reserved concurrency of 990 for my Lambda function. However, during the actual run, the "total concurrent executions" metric peaks at only about 30.

How can I make the "total concurrent executions" higher so my computation runs faster? Thanks

asked 2 years ago · 414 views
2 Answers

Hi,

To really test the available concurrency, I would eliminate the upload time itself. To achieve that, I would first upload the images into a staging bucket, then copy them from the staging bucket to a second bucket where the trigger is defined. For the copy, I would use the `aws s3 sync` command across the two buckets: it is faster than copying the files individually and gives concurrency the best chance to increase.

In your current setup, the time to upload the image may be the factor limiting the concurrency.
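If the sequential upload loop in the question is indeed the bottleneck, parallelizing the uploads would raise the arrival rate of S3 events and, with it, the concurrency. A minimal sketch, where `upload` is a stand-in for a real `boto3` `upload_file` call (the names and list contents here are illustrative, not taken from the original post):

```python
from concurrent.futures import ThreadPoolExecutor

def upload(name):
    # Stand-in for s3.upload_file(local_path, bucket, name);
    # a real boto3 call would go here.
    return name

# Stand-in for the asker's imagelist of (row, col) tuples.
imagelist = [(i, i + 1) for i in range(100)]
names = [str(a) + '_' + str(b) + '.png' for a, b in imagelist]

# Upload many files at once instead of one at a time, so S3
# events (and hence Lambda invocations) arrive much faster.
with ThreadPoolExecutor(max_workers=32) as pool:
    results = list(pool.map(upload, names))

print(len(results))  # 100
```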

Best,

Didier

EXPERT
answered 2 years ago
  • Indeed, this is a good and easy way to pinpoint the bottleneck. I also agree that the most obvious explanation is that the rate at which files arrive in the bucket is the limiting factor.


Approximately how long does it take to upload all 10,000 files to S3?

EXPERT
answered 2 years ago
  • With my image analysis code (which takes 20 sec to run for each image): 2 hours. Without that code (each Lambda invocation takes only about 30 ms): 1.35 hours.

  • Invocations without the analysis code peak at ~360, compared to about 220 with the code.
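These figures are consistent with the upload rate capping concurrency. By Little's law, steady-state concurrency ≈ arrival rate × per-invocation duration; a rough check using the numbers above (10,000 files, ~2 hours to upload, 20 s per invocation — approximate figures, not exact measurements):

```python
# Rough check: concurrency = arrival_rate * duration (Little's law).
total_files = 10_000
upload_seconds = 2 * 3600                     # ~2 hours to upload everything
arrival_rate = total_files / upload_seconds   # ~1.4 files/second
duration = 20                                 # seconds per invocation

expected_concurrency = arrival_rate * duration
print(round(expected_concurrency))  # 28 -- close to the observed ~30
```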

