Lambda Package Exceeds 60MB: Solutions for Large Dependencies?

0

Hi, I'm facing an issue with my Lambda function, which requires several Python packages; the resulting deployment package is around 60 MB, which exceeds Lambda's deployment package size limit. I tried moving the dependencies into a Lambda Layer, but the layer's zip file is also around 60 MB, so that didn't work either.

I also attempted to upload the package to an S3 bucket and reference it in my Lambda function, but this approach led to an error about exceeding limits as well.

Has anyone encountered this issue before or found a way to handle large package sizes in Lambda functions? I would appreciate any advice or alternative solutions.

Gagan
Asked 2 months ago · 165 views
4 Answers
3

I'd recommend looking at Lambda container images, where the limit on the size of your code and dependencies is 10 GB.

Expert (AWS)
Answered 2 months ago
2

Hello.

Quotas such as the deployment package size limits cannot currently be adjusted.
https://docs.aws.amazon.com/lambda/latest/dg/gettingstarted-limits.html

Therefore, if you want to use a large package with Lambda, you can work around the limit by deploying the function as a container image, which supports up to 10 GB.
https://docs.aws.amazon.com/lambda/latest/dg/images-create.html#images-types

Lambda supports a maximum uncompressed image size of 10 GB, including all layers.

Expert
Answered 2 months ago
1
Accepted Answer

1. Optimize Your Dependencies

Exclude Unnecessary Packages: Review your requirements.txt or equivalent and remove any unnecessary packages or dependencies.

Use Lambda-Compatible Packages: Check whether smaller, Lambda-compatible builds of the packages are available.

Custom Build: Rebuild some packages from source, removing unnecessary components or dependencies to reduce size.
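
To see where the bulk actually is, here is a minimal sketch, assuming the dependencies are vendored into a local package/ directory (e.g. with pip install --target package -r requirements.txt); it prints the size of each top-level package so the heaviest candidates for trimming stand out:

import os

def dir_size(path):
    # Sum the sizes of all files under a directory tree.
    total = 0
    for root, _, files in os.walk(path):
        for name in files:
            total += os.path.getsize(os.path.join(root, name))
    return total

# 'package' is an assumed vendoring directory, not a fixed convention.
for entry in sorted(os.listdir('package')):
    full = os.path.join('package', entry)
    if os.path.isdir(full):
        print(f"{dir_size(full) / (1024 * 1024):8.1f} MB  {entry}")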

2. Create Multiple Lambda Layers

Divide Dependencies Across Layers: Instead of one large Lambda Layer, split your dependencies into multiple layers. AWS Lambda allows up to 5 layers to be attached to a single function. Note that this only works around the per-upload zip limit; the function and all of its layers must still fit within the 250 MB unzipped quota in total.
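
As a sketch of publishing and attaching split layers with boto3 (the bucket, keys, and function name below are hypothetical; each zip is assumed to hold a subset of the dependencies under a top-level python/ directory, as layers require):

import boto3

lam = boto3.client('lambda')

# Publish one layer version per dependency subset.
layer_arns = []
for name, key in [('numpy-layer', 'layers/numpy-layer.zip'),
                  ('pandas-layer', 'layers/pandas-layer.zip')]:
    resp = lam.publish_layer_version(
        LayerName=name,
        Content={'S3Bucket': 'my-bucket', 'S3Key': key},
        CompatibleRuntimes=['python3.9'],
    )
    layer_arns.append(resp['LayerVersionArn'])

# Attach the layers (up to 5) to the function.
lam.update_function_configuration(FunctionName='my-function', Layers=layer_arns)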

3. Use AWS Lambda with Docker (Container Images)

Switch to Lambda Containers: AWS Lambda supports container images up to 10 GB (uncompressed, including all layers). You can package your Lambda function along with all of its dependencies into a Docker container image. This approach bypasses the zip deployment package limits entirely and allows for much larger dependencies.

Steps:

Create a Dockerfile with your function code and dependencies.

Build and test the Docker image locally.

Push the image to Amazon Elastic Container Registry (ECR).

Create or update the Lambda function to use the container image from ECR.

Example Dockerfile:

FROM public.ecr.aws/lambda/python:3.9

# Install dependencies
COPY requirements.txt .
RUN pip install -r requirements.txt

# Copy function code
COPY app.py .

# Command to run your Lambda function
CMD ["app.lambda_handler"]

Building and Deploying:

# Authenticate Docker with your ECR registry before pushing
aws ecr get-login-password --region <region> | docker login --username AWS --password-stdin <account-id>.dkr.ecr.<region>.amazonaws.com

docker build -t my-lambda-function .

docker tag my-lambda-function:latest <your-ecr-repo>:latest

docker push <your-ecr-repo>:latest

4. Load Dependencies Dynamically

Use S3 for Large Packages: Store your large packages in S3 and download them dynamically within your Lambda function at runtime. While this increases cold start time, it can help you stay within the deployment package limits.

Download and Extract: Download and extract the dependencies into the Lambda function's /tmp directory, which provides 512 MB of ephemeral storage by default (configurable up to 10 GB).

import sys
import zipfile

import boto3

def lambda_handler(event, context):
    # Download the zipped dependencies from S3 into Lambda's writable /tmp.
    s3 = boto3.client('s3')
    s3.download_file('my-bucket', 'large-package.zip', '/tmp/large-package.zip')

    # Extract them and put the extracted directory on the import path.
    with zipfile.ZipFile('/tmp/large-package.zip', 'r') as zip_ref:
        zip_ref.extractall('/tmp/dependencies')

    sys.path.insert(0, '/tmp/dependencies')

    # Your function code

5. Refactor Code to Microservices

Break Down Functionality: If possible, break down your Lambda function into smaller microservices. Each smaller function can handle specific tasks, reducing the size of the deployment package.

6. Use AWS Step Functions

Orchestrate with Step Functions: If your large package is due to multiple distinct tasks, consider using AWS Step Functions to orchestrate multiple smaller Lambda functions. This way, each Lambda function can remain within the size limits.
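
As a minimal sketch (all ARNs and names below are hypothetical), an Amazon States Language definition that chains two small Lambda functions, created via boto3:

import json
import boto3

sfn = boto3.client('stepfunctions')

# Two small Lambda functions, each with its own (smaller) package.
definition = {
    "StartAt": "Preprocess",
    "States": {
        "Preprocess": {
            "Type": "Task",
            "Resource": "arn:aws:lambda:us-east-1:123456789012:function:preprocess",
            "Next": "Predict",
        },
        "Predict": {
            "Type": "Task",
            "Resource": "arn:aws:lambda:us-east-1:123456789012:function:predict",
            "End": True,
        },
    },
}

sfn.create_state_machine(
    name='split-pipeline',
    definition=json.dumps(definition),
    roleArn='arn:aws:iam::123456789012:role/step-functions-execution-role',
)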

7. Explore Third-Party Solutions

Serverless Framework: Some tools, like the Serverless Framework, offer plugins to help manage large packages and dependencies more effectively.

Summary:

Optimize dependencies to reduce the package size.

Use multiple layers or Docker containers for larger packages.

Dynamically load dependencies from S3 if necessary.

Break down functionality into smaller Lambda functions or microservices.

Expert
Answered 2 months ago
  • A lot of good suggestions. But keep in mind:

    • Reducing the size by any means possible is the best option when feasible, since image size impacts cold start times.
    • Layers don't increase the total size available to the Lambda function.
    • Option 4 (Load Dependencies Dynamically) is cool, but you should do the download outside of the handler so it happens during init, not every time the Lambda runs; see the sketch below.
      • If the dependencies are REALLY huge, it could take longer than the 10-second init window.
    • If you just want to make it work and not think about dependencies (and the image size is less than 10 GB!), the Docker container approach is the easiest.
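
A minimal sketch of what the commenter describes: moving the download from the handler in option 4 to module scope, so it runs once per execution environment during init (bucket and key reuse the names from the answer above):

import sys
import zipfile

import boto3

# Module scope runs during the init phase, once per execution
# environment, instead of on every invocation.
s3 = boto3.client('s3')
s3.download_file('my-bucket', 'large-package.zip', '/tmp/large-package.zip')
with zipfile.ZipFile('/tmp/large-package.zip', 'r') as zip_ref:
    zip_ref.extractall('/tmp/dependencies')
sys.path.insert(0, '/tmp/dependencies')

def lambda_handler(event, context):
    # Dependencies are already importable here.
    ...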
0

Another option is to put the dependencies in EFS, mount the directory containing them on the Lambda function, and update PYTHONPATH (or sys.path) to include the mounted directory.

Just watch out for cost. EFS is great, but storage is surprisingly expensive.
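
A minimal sketch, assuming the function's EFS access point is mounted at /mnt/efs (set in the function's file system configuration) and the packages were installed there with pip install --target /mnt/efs/python:

import sys

# /mnt/efs is an assumed mount path; use whatever your function's
# file system configuration specifies.
sys.path.insert(0, '/mnt/efs/python')

import pandas  # hypothetical large dependency, now resolved from EFS

def lambda_handler(event, context):
    ...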

Answered 1 month ago
