Layer issue with AWS Lambda


Hi,

I am trying to move working code from my local machine to AWS Lambda. It is primarily data-wrangling code that generates a PDF file.

For this, I added the AWS layer AWSSDKPandas-Python38, but it consumes so much space that adding any other layer triggers the message "Layers consume more than the available size of 262144000 bytes". When I put pandas in a custom layer instead, I get the error below:

"errorMessage": "Unable to import module 'lambda_function': C extension: No module named 'pandas._libs.interval' not built. If you want to import pandas from the source directory, you may need to run 'python setup.py build_ext --force' to build the C extensions first.", "errorType": "Runtime.ImportModuleError", "stackTrace": []

I ran into similar issues with numpy earlier, for which I resorted to the AWS layer AWSLambda-Python38-SciPy1x. There also used to be an AWS data wrangling layer that worked for most of my deployments, but I can't find it now.

Any pointers on how to fix this would be great.

Regards, dbeings


dbeing
asked a year ago · 1158 views
4 Answers

Please follow the steps below to fix this issue. Refer to this article for details: https://www.linkedin.com/pulse/how-use-custom-python-modules-aws-lambda-sumit-potdar. High-level steps:

  1. Create an Amazon Linux instance and check the Python version on this EC2 instance. [Optional: upgrade or downgrade to a specific Python version based on your need.]
  2. Download the necessary packages (pandas for our use case) and create a zip.
  3. Upload the zip to an S3 bucket.
  4. Create the Lambda function (choose the same Python version used on the Amazon Linux instance).
  5. Apply the layer to the Lambda function.
  6. Import the pandas library in the code and do a test run.

Demonstration. Step 1: Create an Amazon Linux instance and log in to it to run the commands below. Create a working directory with a python folder inside it (a Lambda layer zip must have a top-level python directory):

    mkdir -p Demo/python
    cd Demo

Install the necessary packages (pandas for our use case) into the python folder, then remove the dist-info metadata to save space:

    pip3 install pandas -t python/
    rm -rf python/*.dist-info

Step 2: Zip the python directory. (Note: the zip must contain the python directory at its root.)

    zip -r pandas_layer.zip python

Step 3: Create an S3 bucket (if not already available) and upload the zip file created in Step 2. (Note: for our use case, we have created an S3 bucket named python-demo-pandas where we upload this zip file.)

    aws s3 cp pandas_layer.zip s3://python-demo-pandas/
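
If you prefer the CLI for this part, a sketch like the following should publish the uploaded zip as a layer version (the layer name pandas-layer is just an example):

    aws lambda publish-layer-version \
        --layer-name pandas-layer \
        --content S3Bucket=python-demo-pandas,S3Key=pandas_layer.zip \
        --compatible-runtimes python3.8

The command returns the new layer version ARN, which is what you attach in Step 5.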

Step 4: Create the Lambda function from the AWS Console and select the same Python version that was used on the Amazon Linux instance. This matters because compiled modules are sometimes incompatible across Python versions, and you might otherwise face import issues in the Lambda function.

Step 5: Create the layer for Lambda from the AWS Console.

Go to the Lambda function to which the layer has to be applied. Click on 'Layers', which redirects to the section where you can add layers, and click the 'Add layer' button.

Step 6: Import the pandas library in the code and do a test run to see if it is working fine. A successful test execution indicates the layer has been applied correctly. Good job, you did it!
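
The attach-and-test flow can also be scripted; this is a sketch with placeholder names (replace my-pandas-function and the layer ARN with your own values). Note that --layers replaces the function's entire layer list, so pass every layer you need in one call:

    aws lambda update-function-configuration \
        --function-name my-pandas-function \
        --layers arn:aws:lambda:us-east-1:123456789012:layer:pandas-layer:1

    aws lambda invoke --function-name my-pandas-function response.json
    cat response.json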

answered a year ago
  • Tried getting pandas from EC2 now, but I am still getting the message "Layers consume more than the available size of 262144000 bytes". I have just 4 layers as of now: googleAPIPythonClient, reportlab, PIL, AWSLambda-Python38-SciPy1x. My custom layers don't add up to more than 70 MB. It appears I can't use AWSLambda-Python38-SciPy1x along with the pandas layer I built on EC2, as that crosses the size limit. When I remove the AWS SciPy1x layer and use EC2-built numpy and pandas packages instead, I get another error with numpy (below).

    Is there any way I can use the old AWSDataWrangler-Python39 layer? Based on earlier deployments, I believe it took care of both the pandas and numpy libraries.

    Numpy error: { "errorMessage": "Unable to import module 'lambda_function': Unable to import required dependencies:\nnumpy: \n\nIMPORTANT: PLEASE READ THIS FOR ADVICE ON HOW TO SOLVE THIS ISSUE!\n\nImporting the numpy C-extensions failed. This error can happen for\nmany reasons, often due to issues with your setup or how NumPy was\ninstalled.\n\nWe have compiled some common reasons and troubleshooting tips at:\n\n https://numpy.org/devdocs/user/troubleshooting-importerror.html\n\nPlease note and check the following:\n\n * The Python version is: Python3.8 from "/var/lang/bin/python3.8"\n * The NumPy version is: "1.24.3"\n\nand make sure that they are the versions you expect.\nPlease carefully study the documentation linked above for further help.\n\nOriginal error was: No module named 'numpy.core._multiar


Did your Lambda code package include the Python pandas module correctly? You get this error when your Lambda deployment package does not contain the external Python module libraries. You need to build a custom package that bundles them, or use the AWS CodeBuild service to install the Python modules and package them for deployment, so that all required modules are present in the Lambda code source. Refer to this article: https://www.linkedin.com/pulse/how-use-custom-python-modules-aws-lambda-sumit-potdar
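
As a minimal sketch of that bundling approach (the file and function names here are just examples), the dependencies can be installed next to the handler and shipped as the deployment package itself:

    pip3 install pandas -t package/
    cp lambda_function.py package/
    cd package
    zip -r ../function.zip .
    cd ..
    aws lambda update-function-code \
        --function-name my-pandas-function \
        --zip-file fileb://function.zip

Note that the same 262144000-byte unzipped limit applies to the deployment package and layers combined, so this fixes missing-module errors but not the size error.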

answered a year ago

Hi,

If layers are taking up too much space, an alternative is to run Lambda as a container image, which raises the size limit from 250 MB to 10 GB: https://docs.aws.amazon.com/lambda/latest/dg/images-create.html.
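
As a rough sketch of that approach (the handler file name and package list are assumptions to adapt), the image can be built on the AWS-provided Python 3.8 base image:

    cat > Dockerfile <<'EOF'
    FROM public.ecr.aws/lambda/python:3.8
    # Heavy dependencies go into the image instead of layers
    RUN pip install pandas numpy reportlab
    # Copy the handler into the task root expected by the base image
    COPY lambda_function.py ${LAMBDA_TASK_ROOT}
    CMD ["lambda_function.lambda_handler"]
    EOF
    docker build -t my-pandas-lambda .

After building, push the image to Amazon ECR and create the function with --package-type Image, as described in the linked docs.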

Hope it helps ;)

EXPERT
answered a year ago

Here is another easy alternative that worked well for me: you can find an ARN for your Python version and library here: https://github.com/keithrozario/Klayers

And add a Lambda layer using that ARN.
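
For example, once you have copied the matching ARN from the Klayers tables (the ARN below is only a placeholder shape; use the exact value from the repo), attaching it from the CLI could look like:

    aws lambda update-function-configuration \
        --function-name my-pandas-function \
        --layers arn:aws:lambda:<region>:<klayers-account-id>:layer:<klayers-pandas-layer>:<version>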

Himal
answered 10 months ago
