How come I have to use S3 to publish Lambda layers when they are large? Where are the docs on this?


I get an error saying the unzipped size must be smaller than X bytes when I try to deploy with SAM or even aws lambda publish-layer-version

However, if I upload my layers to S3 first, then I can deploy just fine with ContentUri. I stumbled on this in an internet post, but I can't find AWS docs that discuss it. Can anybody point me in the right direction?

2 Answers

Let me clarify how these limits work and why using Amazon S3 can help in certain scenarios.

Direct Deployment Limits:

When deploying your Lambda function or layer directly using tools like the AWS SAM CLI, the AWS CLI, or the AWS Management Console, the following size restrictions apply:

  • Zipped package size (direct upload): Maximum of 50 MB
  • Unzipped package size: Maximum of 250 MB

These limits are enforced during deployment. If your zipped package exceeds 50 MB on a direct upload, or if the unzipped contents exceed 250 MB, the deployment fails; in the latter case the error states that the unzipped size must be smaller than 262144000 bytes (250 MB).
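As a rough illustration, a direct publish with the AWS CLI looks like the sketch below (the layer name, runtime, and file name are placeholders). This is the path where an oversized archive is rejected, and a layer that expands past 250 MB produces the "Unzipped size must be smaller than 262144000 bytes" error:

    # Direct upload: the zipped archive is sent in the API request itself,
    # so it must be <= 50 MB zipped and <= 250 MB once unzipped.
    aws lambda publish-layer-version \
      --layer-name my-deps-layer \
      --compatible-runtimes python3.12 \
      --zip-file fileb://layer.zip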

S3-Based Deployment:

When you upload your deployment package to Amazon S3 and reference it in your deployment configuration (for example, via the ContentUri property in your SAM template), you can work with larger zipped files. Here's how it works:

  • Zipped package size (uploaded to S3): Can be larger than 50 MB
  • Unzipped package size: Still must not exceed 250 MB

By uploading your package to S3, you bypass the 50 MB limit imposed on direct uploads. However, the unzipped size limit of 250 MB still applies. This means that while you can deploy larger zipped packages via S3, the total size of the uncompressed contents must remain within the 250 MB limit.

Using S3 for deployment is beneficial when your zipped package exceeds the 50 MB direct upload limit but the unzipped contents are within the 250 MB threshold. This approach allows you to deploy larger packages that would otherwise be rejected during direct uploads.
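As a sketch with placeholder bucket, key, layer, and runtime names, the S3-based flow looks like this with the plain AWS CLI; pointing ContentUri in a SAM template at the same S3 location expresses the same idea:

    # Stage the zipped layer in S3, then publish it by reference so the
    # archive itself is not sent in the API request.
    aws s3 cp layer.zip s3://my-artifact-bucket/layers/layer.zip

    aws lambda publish-layer-version \
      --layer-name my-deps-layer \
      --compatible-runtimes python3.12 \
      --content S3Bucket=my-artifact-bucket,S3Key=layers/layer.zip
    # The zipped file may exceed 50 MB here, but it must still unzip to 250 MB or less.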

For more detailed information, you can refer to the deployment package size limits in the AWS Lambda quotas documentation: https://docs.aws.amazon.com/lambda/latest/dg/gettingstarted-limits.html

answered a month ago
  • @William_O, so if I try to deploy layers-1 and layers-2 at the same time, do both of those count toward the same 50 MB limit?


The reason you need to use S3 to publish large Lambda layers is the size limit AWS Lambda imposes on direct uploads. When you upload a deployment package or layer archive directly to Lambda, the ZIP file is limited to 50 MB. This limit applies when you use the Lambda console, the CLI, or API calls like aws lambda publish-layer-version.

However, Lambda lets you work with larger zipped files by storing them in Amazon S3 first. When you use the S3 approach with the ContentUri property, the zipped archive can exceed 50 MB, as long as its contents stay within the 250 MB unzipped limit.

The error message you're seeing about "unzipped size must be smaller than X bytes" is related to these size restrictions. It's important to note that even when using S3, there's still a limit of 250 MB for the unzipped contents of your layer.
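If you want to check whether a layer will clear that limit before publishing, a quick local check along these lines can help (an assumption: a Unix-like shell with unzip available, and layer.zip as a placeholder file name):

    # Print the total uncompressed size of the archive's contents; compare it
    # against 262144000 bytes (250 MB), the limit quoted in the error message.
    unzip -l layer.zip | tail -n 1 | awk '{print $1, "bytes uncompressed"}'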

If you need to work with even larger dependencies, there are a few alternatives you might consider:

  1. Use multiple Lambda layers to split your dependencies (up to 5 layers can be attached to a single function); see the sketch after this list.
  2. Utilize Lambda container images, which support image sizes of up to 10 GB.
  3. Consider using Amazon EFS (Elastic File System) for storing larger dependencies that can be accessed by your Lambda function.
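For the first option, a minimal sketch of splitting dependencies across two layers and attaching both to one function might look like this (the bucket, keys, layer and function names, region, and account ID are all placeholders):

    # Publish two smaller layers from S3, then attach both layer version ARNs
    # to a single function.
    aws lambda publish-layer-version --layer-name deps-part-1 \
      --content S3Bucket=my-artifact-bucket,S3Key=layers/deps-part-1.zip
    aws lambda publish-layer-version --layer-name deps-part-2 \
      --content S3Bucket=my-artifact-bucket,S3Key=layers/deps-part-2.zip

    aws lambda update-function-configuration \
      --function-name my-function \
      --layers \
        arn:aws:lambda:us-east-1:123456789012:layer:deps-part-1:1 \
        arn:aws:lambda:us-east-1:123456789012:layer:deps-part-2:1
    # Note: the function code plus all attached layers must still unzip to 250 MB or less in total.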

While it may not be explicitly stated in all AWS documentation, this S3 workaround for larger layers is a common practice and is supported by AWS. It's designed to help manage larger deployment packages while still maintaining certain limits for operational reasons.
Sources
  • Troubleshoot deployment issues in Lambda - AWS Lambda
  • Lambda Layer not changing when taken from S3 in cf template | AWS re:Post
  • Lambda Package Exceeds 60MB: Solutions for Large Dependencies? | AWS re:Post
  • Unzipped size must be smaller than 262144000 bytes | AWS re:Post

answered a month ago
