invoke_endpoint error in Lambda: StreamingBody is not JSON serializable


I'm writing a Lambda function that invokes an endpoint:

import json
import boto3

runtime = boto3.Session().client('runtime.sagemaker')
payload = {"data": ["McDonalds"]}
response = runtime.invoke_endpoint(EndpointName=ENDPOINT_NAME,
                                   ContentType='application/json',
                                   Body=json.dumps(payload))

It returns this error:

An error occurred during JSON serialization of response: <botocore.response.StreamingBody object at 0x7f59e40acc50> is not JSON serializable

I tried this exact function in a SageMaker notebook and it works, but it doesn't work in Lambda. Can someone please help me?

Edited by: aurelius on Feb 19, 2019 10:38 PM

asked 5 years ago · 3,023 views
5 Answers

Was this because I didn't attach the necessary SageMaker policies to the IAM role? I only added this policy:

{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "VisualEditor0",
            "Effect": "Allow",
            "Action": "sagemaker:InvokeEndpoint",
            "Resource": "*"
        }
    ]
}
answered 5 years ago

Hi aurelius,
Do you have any further code after the response = runtime.invoke_endpoint(..) line? The error message points to a problem with serializing the response object.

Your role permission looks fine.
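As a quick illustration of what is going wrong (an in-memory byte stream stands in for the botocore StreamingBody here), returning the raw response fails, while reading and decoding the Body first yields a plain dict that Lambda can serialize:

```python
import io
import json

# invoke_endpoint returns its payload under "Body" as a botocore StreamingBody.
# json.dumps cannot serialize a stream, which is exactly the error above.
response = {"Body": io.BytesIO(b'{"score": 0.5}')}

try:
    json.dumps(response)
except TypeError as exc:
    print("not serializable:", exc)

# Reading and decoding the stream first gives a plain dict Lambda can return.
result = json.loads(response["Body"].read().decode("utf-8"))
print(result)
```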

Thank you,
Arun

AWS
answered 5 years ago

I am facing a similar issue: in a SageMaker Jupyter notebook instance the endpoint is invoked successfully and I am able to get back the inference results.

Below is the Lambda function I used to invoke the endpoint, but I am facing the following error:

import json 
import io
import boto3 

client = boto3.client('runtime.sagemaker')

def lambda_handler(event, context):
    print("Received event: " + json.dumps(event, indent=2))
    
    #data = json.loads(json.dumps(event))
    #payload = data['data']
    print(json.dumps(event))
    
    response = client.invoke_endpoint(EndpointName='linear-learner-2019-12-12-16-20-56-788',
                                  ContentType='application/json',
                                  Body=(json.dumps(event)))
    return response

Output:

Function Logs:
START RequestId: 7f4c7589-b70f-4af8-834c-89a1a1fbe5e5 Version: $LATEST
Received event: {
  "instances": [
    {
      "features": [
        0.1,
        0.2
      ]
    }
  ]
}
{"instances": [{"features": [0.1, 0.2]}]}
An error occurred during JSON serialization of response: <botocore.response.StreamingBody object at 0x7f53918e2828> is not JSON serializable

Please help me resolve the issue; you can see the JSON input passed to the function. Not sure what is going wrong here. I even checked the CloudWatch logs but was not able to identify the origin of the issue.

Thanks in advance,
Arun

Edited by: NMAK on Dec 18, 2019 5:54 AM

NMAK
answered 4 years ago

I call Sagemaker from Lambda using a slightly different approach in terms of data structures:

import json
import boto3

final_data = ','.join(ordered_data.iloc[0].astype(str).values.tolist())
runtime = boto3.client('runtime.sagemaker')
response = runtime.invoke_endpoint(EndpointName='whatever-endpoint', 
                                           ContentType='text/csv',
                                           Body=final_data)
result = json.loads(response['Body'].read().decode())

I start with a dataframe of one row containing all the data.
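Without pandas, the same comma-joined CSV payload can be built from a plain Python list (the feature values below are made up for illustration):

```python
row = [5.1, 3.5, 1.4, 0.2]  # hypothetical feature values for one observation
final_data = ",".join(str(v) for v in row)
print(final_data)  # 5.1,3.5,1.4,0.2
```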

answered 4 years ago

Hello Javierlopez,

Thanks for your response, I really appreciate it.

I tried the code below and it still gives me the same error:

    payload = ','.join(str(item) for item in data['instances'][0]['data']['features'])
    #payload=bytearray(payload)
    print(payload)
    
    response = client.invoke_endpoint(EndpointName='linear-learner-2019-12-12-16-20-56-788',
                                  ContentType='text/csv',
                                  Body=payload)
    return response

Below is the output/error for the above code.

Function Logs:
START RequestId: 73daa03e-dec2-4d37-b779-72c1f70a7142 Version: $LATEST
Received event: {
  "instances": [
    {
      "data": {
        "features": [
          0.1,
          0.2
        ]
      }
    }
  ]
}
0.1,0.2 # this is the payload sent for inference.
An error occurred during JSON serialization of response: <botocore.response.StreamingBody object at 0x7f77555e59e8> is not JSON serializable

I really don't understand what format the function expects for the input; my colleague ran a different model with the same JSON format I used and it works fine. I believe the format you used is the inference format for XGBoost. I am not able to find any documentation on sample linear-learner inference requests using Lambda.

Could you please point me in the right direction?
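For reference, here is a sketch of both payload shapes built from the event in the logs above. The JSON shape with "features" directly under each instance is an assumption based on the documented linear-learner request format, so verify it against the SageMaker docs. Note also that the serialization error itself comes from returning the raw response: the handler would still need to return json.loads(response['Body'].read().decode()) instead of response.

```python
import json

# Event in the shape shown in the logs above.
event = {"instances": [{"data": {"features": [0.1, 0.2]}}]}

# text/csv payload: one comma-separated row of feature values.
csv_payload = ",".join(str(v) for v in event["instances"][0]["data"]["features"])
print(csv_payload)

# application/json payload, with "features" directly under each instance
# (assumed linear-learner request shape; check the SageMaker documentation).
json_payload = json.dumps(
    {"instances": [{"features": i["data"]["features"]} for i in event["instances"]]}
)
print(json_payload)
```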

Regards,
Arun

Edited by: NMAK on Dec 20, 2019 3:04 AM

NMAK
answered 4 years ago
