Hi, the Lambda Developer Guide shows that if a .zip file is larger than 50 MB, you must upload the .zip file to S3 and specify its location when you deploy the function:
If the .zip file archive is smaller than 50 MB, you can upload the .zip file archive from your local machine. If the file is larger than 50 MB, upload the file to the function from an Amazon S3 bucket.
Also note that the extracted .zip file cannot exceed 250 MB.
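As a sketch of that workflow, the AWS CLI can branch on the archive size and pick the upload path accordingly. The bucket name, function name, and file name below are placeholders, and the size check assumes GNU `stat` (macOS uses `stat -f%z`):

```shell
#!/bin/sh
# Placeholders: my-bucket, my-function, function.zip
ZIP=function.zip
LIMIT=$((50 * 1024 * 1024))  # 50 MB direct-upload limit
SIZE=$(stat -c%s "$ZIP" 2>/dev/null || stat -f%z "$ZIP")

if [ "$SIZE" -gt "$LIMIT" ]; then
  # Too large for direct upload: stage in S3 first, then point Lambda at it
  aws s3 cp "$ZIP" "s3://my-bucket/$ZIP"
  aws lambda update-function-code \
    --function-name my-function \
    --s3-bucket my-bucket \
    --s3-key "$ZIP"
else
  # Small enough to upload directly from the local machine
  aws lambda update-function-code \
    --function-name my-function \
    --zip-file "fileb://$ZIP"
fi
```

Either way, the 250 MB unzipped limit still applies after Lambda extracts the archive.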
Hello,
You may want to look at using Lambda layers. One of the benefits of using layers is to reduce the size of your deployment packages. https://docs.aws.amazon.com/lambda/latest/dg/chapter-layers.html
Sharing a tutorial link as well which shows how to publish custom runtime with layers https://docs.aws.amazon.com/lambda/latest/dg/runtimes-walkthrough.html#runtimes-walkthrough-function
Hope this fits your use case.
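For reference, publishing dependencies as a layer and attaching it to a function could look roughly like this. The layer name, function name, runtime, and region/account in the ARN are all placeholders; the `python/` directory structure is what Python runtimes expect inside a layer:

```shell
#!/bin/sh
# Placeholders: my-deps-layer, my-function, account 123456789012, us-east-1
# Package dependencies into the directory layout Lambda layers expect.
mkdir -p layer/python
pip install -r requirements.txt -t layer/python
(cd layer && zip -r ../deps-layer.zip python)

# Publish the layer version...
aws lambda publish-layer-version \
  --layer-name my-deps-layer \
  --zip-file fileb://deps-layer.zip \
  --compatible-runtimes python3.12

# ...then attach it to the function (use the version ARN returned above).
aws lambda update-function-configuration \
  --function-name my-function \
  --layers arn:aws:lambda:us-east-1:123456789012:layer:my-deps-layer:1
```

Keep in mind the function package plus all its layers still count toward the same 250 MB unzipped quota.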
Hello, and I appreciate your prompt response. I've previously attempted the solution you suggested. However, the problem persists because the Python version I've installed is approximately 400MB. Consequently, I encounter the error you described, which states, "Unzipped size must be smaller than 262144000 bytes." Given this constraint, I'm wondering if there might be an alternative approach?
Unfortunately, .zip archive deployment is not feasible in this case, since the 250 MB quota applies to layers and custom runtimes as well. (Note that this quota cannot be increased.)
Container images can be up to 10 GB, so that should be the only viable choice for your Lambda deployment.
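A rough sketch of the container-image path, assuming a `Dockerfile` based on an AWS Lambda base image already exists in the current directory. The account ID, region, repository name, function name, and IAM role are all placeholders:

```shell
#!/bin/sh
# Placeholders: account 123456789012, region us-east-1, my-function, my-lambda-role
REPO=123456789012.dkr.ecr.us-east-1.amazonaws.com/my-function

# Create an ECR repository and authenticate Docker against it.
aws ecr create-repository --repository-name my-function
aws ecr get-login-password --region us-east-1 | \
  docker login --username AWS --password-stdin \
  123456789012.dkr.ecr.us-east-1.amazonaws.com

# Build the image (your Dockerfile should use a Lambda base image,
# e.g. public.ecr.aws/lambda/python), tag it, and push it.
docker build -t my-function .
docker tag my-function:latest "$REPO:latest"
docker push "$REPO:latest"

# Create the function from the pushed image.
aws lambda create-function \
  --function-name my-function \
  --package-type Image \
  --code "ImageUri=$REPO:latest" \
  --role arn:aws:iam::123456789012:role/my-lambda-role
```

Since your Python runtime alone is around 400 MB, this avoids the 250 MB unzipped quota entirely.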
Can you share your use case? The limitation you mentioned does not apply if you use a Docker image, but you specifically mentioned that you are not using one. If you can tell me why, I can better help with it.