Lambda function working locally but crashing on AWS

0

I deploy my Lambda function code as a container image. I built a simple Python image on top of an alternative base image, the FEniCS Project stable image. When the dolfin module is imported, the following error message is displayed before the function crashes:

terminate called after throwing an instance of 'std::logic_error'
what(): basic_string::_M_construct null not valid
Runtime exited with error: signal: aborted (core dumped)
Runtime.ExitError

When I test my function locally with the runtime interface emulator and the same image, I don't get any error messages. After some research, I found that the problem comes from the Python extension (.so) named "cpp" inside the dolfin package, but I don't understand why it works locally and not on AWS.
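One way to narrow down a native crash like this is to enable Python's faulthandler before the failing import, so that the fatal signal produces a Python-level traceback. This is a minimal sketch, with the crashing import left commented out; faulthandler itself is standard library, not part of the original code:

```python
import faulthandler

# Enable the fault handler before importing the native extension. If the
# import aborts (e.g. SIGABRT from an uncaught std::logic_error), Python
# dumps the current traceback to stderr, pinpointing the crashing import.
faulthandler.enable()

print("faulthandler enabled:", faulthandler.is_enabled())

# import dolfin  # the crashing import would go here
```

In Lambda the stderr output lands in CloudWatch Logs, so the traceback would appear just above the `Runtime exited` line.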

Here are my files:

Dockerfile

ARG FUNCTION_DIR="/function"

FROM quay.io/fenicsproject/stable:current as build-image

# Include global arg in this stage of the build
ARG FUNCTION_DIR

# Install aws-lambda-cpp build dependencies
RUN sudo apt-get update -y && \
  sudo DEBIAN_FRONTEND=noninteractive apt-get install -y \
  g++ \
  make \
  cmake \
  unzip \
  libcurl4-openssl-dev

# Create function directory
RUN sudo mkdir -p ${FUNCTION_DIR}

# Copy function code
COPY /aws_documents/app ${FUNCTION_DIR}

RUN sudo pip install --upgrade pip && \
    sudo pip install \
        --target ${FUNCTION_DIR} \
        awslambdaric

# Multi-stage build: grab a fresh copy of the base image (to keep the image light)

FROM quay.io/fenicsproject/stable:current

# Include global arg in this stage of the build
ARG FUNCTION_DIR

# Set working directory to function root directory
WORKDIR ${FUNCTION_DIR}

# Copy in the build image dependencies
RUN sudo pip install --upgrade pip
COPY --from=build-image ${FUNCTION_DIR} ${FUNCTION_DIR}

# Set the ENTRYPOINT to invoke the runtime interface client
ENTRYPOINT [ "/usr/bin/python3.6", "-m", "awslambdaric" ]
CMD [ "app.handler" ]

app.py

import os

def handler(event, context):
    os.environ['XDG_CACHE_HOME'] = '/tmp/.cache'
    print("before import")
    import dolfin
    return "okay"
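The `XDG_CACHE_HOME` line in the handler above is needed because Lambda's filesystem is read-only apart from /tmp, and dolfin's JIT machinery writes to a cache directory on import. A sketch of that idea as a reusable helper (the helper name is mine, not from the original code; locally it simply resolves to the system temp directory):

```python
import os
import tempfile

def ensure_writable_cache(env_var="XDG_CACHE_HOME"):
    """Point cache writes at the temp directory (/tmp in Lambda).

    Lambda only allows writes under /tmp, so any library that caches
    compiled artifacts must be redirected there before it is imported.
    """
    cache_dir = os.path.join(tempfile.gettempdir(), ".cache")
    os.makedirs(cache_dir, exist_ok=True)
    os.environ[env_var] = cache_dir
    return cache_dir

print(ensure_writable_cache())
```

Calling this at the top of the handler, before `import dolfin`, has the same effect as the inline assignment.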
  • Are you deploying to x86 or ARM?

  • I am deploying to x86

asked 2 years ago · 3,031 views
1 Answer
0

Has the dolfin package been compiled on an Amazon Linux platform? If it was created anywhere else the architecture is unlikely to match the runtime that Lambda uses (which is Amazon Linux).
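Since both environments here are x86 (per the comments above), a quick sanity check of the compiled extension's target architecture can rule an architecture mismatch in or out. This sketch reads the `e_machine` field of an ELF header; the machine constants come from the ELF specification, and the fake header is purely illustrative:

```python
import struct

# ELF e_machine values for the two architectures Lambda supports.
ELF_MACHINES = {0x3E: "x86-64", 0xB7: "aarch64"}

def elf_architecture(header: bytes) -> str:
    """Return the target architecture encoded in an ELF file header."""
    if header[:4] != b"\x7fELF":
        raise ValueError("not an ELF file")
    # e_machine is a little-endian uint16 at byte offset 18
    (machine,) = struct.unpack_from("<H", header, 18)
    return ELF_MACHINES.get(machine, f"unknown (0x{machine:x})")

# Illustrative fake header: 16-byte e_ident, e_type, then e_machine=0x3E
fake = b"\x7fELF" + b"\x00" * 12 + b"\x02\x00" + b"\x3e\x00"
print(elf_architecture(fake))
```

To check the real extension, read the first 20 bytes of the `cpp*.so` file from the dolfin package. Note that matching architectures does not rule out a libc/ABI mismatch between Ubuntu and Amazon Linux.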

It's not clear when you say "locally" what type of system that is - so I'd encourage you to test it on an EC2 instance.

AWS EXPERT · answered 2 years ago
  • No, the dolfin package was compiled into a Docker image that inherits from phusion/baseimage (base system: Ubuntu).

    To be more accurate about "locally": I installed the RIE on my local machine (macOS High Sierra 10.13.6) and I run my Lambda function using Docker Engine and the docker run command (following the last section of https://docs.aws.amazon.com/lambda/latest/dg/images-test.html).

    After your comment I tried an EC2 instance with the Amazon Linux 2 AMI (HVM), kernel 5.10. I installed Docker Engine on the instance and built the image from the Dockerfile. Running the image as a container on EC2 and importing the dolfin package works fine, but when I push the same image to ECR and execute it with AWS Lambda I get the same std::string error message.

  • I've seen something similar in the distant past with another Python package (something to do with PDFs) that included precompiled binaries; no matter what I did (even rebuilding from source), it would work on an instance but not in Lambda. I eventually ran out of time/patience/brainpower and did it another way. But the Lambda runtime is quite restricted in which kernel calls can be made. Without diving into the package's source it's impossible to say, but my guess is that it's doing something that isn't allowed in Lambda. Very vague, sorry.
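To compare the two environments concretely, one option is to log a small diagnostic report from the handler before the crashing import. This is a rough sketch; the field selection is my guess at what commonly differs between an EC2 instance and the Lambda sandbox, not a definitive checklist:

```python
import os
import platform

def runtime_report() -> dict:
    """Collect facts that often differ between an instance and Lambda.

    Machine architecture, libc version, and filesystem writability are
    common reasons a native extension aborts in one environment only.
    """
    return {
        "machine": platform.machine(),          # e.g. 'x86_64'
        "libc": platform.libc_ver(),            # ('glibc', '2.x') on glibc
        "tmp_writable": os.access("/tmp", os.W_OK),
        "cwd_writable": os.access(os.getcwd(), os.W_OK),
    }

print(runtime_report())
```

Printing this from the handler on both EC2 and Lambda, then diffing the CloudWatch output against the instance's output, can at least show whether the extension's assumptions about the environment hold.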
