Endpoints test inference gives error 415


Hi,

I deployed an abalone test model, and now I want to test it through the Endpoint Test Inference UI, but it gives me an error:

Error invoking endpoint: Received client error (415) from primary with message "application/json is not an accepted ContentType: csv, libsvm, parquet, recordio-protobuf, text/csv, text/libsvm, text/x-libsvm, application/x-parquet, application/x-recordio-protobuf.". See https://ap-northeast-1.console.aws.amazon.com/cloudwatch/home?region=ap-northeast-1#logEventViewer:group=/aws/sagemaker/Endpoints/abaloneTest in account 556267670448 for more information.

I understand the error means the content type "application/json" is not accepted, but I can't find out how to change the default value from "application/json" to one of the permitted values and test the endpoint using the test inference UI only. It would be good if the UI provided a sample request format, so that we could test quickly.

asked a year ago · 307 views
2 Answers

Hi and thanks for reaching out,

Today, the endpoint test UI in SageMaker Studio only supports application/json content, so it can't be used with endpoints like yours that don't accept that content type. You'll need to invoke the endpoint from code instead, for example from a notebook or terminal.

From Python in particular you would have two options:

  1. The low-level boto3 SDK as used here, which maps directly to the underlying InvokeEndpoint API and requires you to serialize your content (e.g. to CSV string) yourself, or
  2. The high-level sagemaker SDK, which might be more familiar if you've followed tutorials with e.g. the Estimator / Predictor syntax.
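For option 1, here's a minimal sketch of the low-level path (the endpoint name and feature values below are placeholders, not from your deployment): you serialize the row to a CSV string yourself and set a matching ContentType header on the InvokeEndpoint call.

```python
def to_csv_row(features):
    """Serialize one feature row to the CSV string the endpoint expects."""
    return ",".join(str(x) for x in features)

def invoke_csv(endpoint_name, features, region="ap-northeast-1"):
    """Call InvokeEndpoint with a text/csv payload and return the raw response body."""
    # Imported here so the serialization helper above is usable without boto3 installed.
    import boto3

    runtime = boto3.client("sagemaker-runtime", region_name=region)
    response = runtime.invoke_endpoint(
        EndpointName=endpoint_name,
        ContentType="text/csv",  # one of the types the model actually accepts
        Body=to_csv_row(features),
    )
    return response["Body"].read().decode("utf-8")
```

You'd then call e.g. `invoke_csv("your-endpoint-name", [0, 0.455, 0.365, 0.095])` and parse the CSV string that comes back.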

Either is fine, really. The high-level SDK just packages up the process of stringifying and de-stringifying objects for the HTTP(S) request using its serializer and deserializer classes - so you would set your predictor up with the appropriate de/serializer and then pass in the data in your usual format, e.g. a DataFrame or nested list:

import pandas as pd
import sagemaker

# CSVSerializer sets the text/csv ContentType header and converts the input
# (DataFrame, array, or nested list) to a CSV string for you.
predictor = sagemaker.Predictor(
    "your-endpoint-name",
    serializer=sagemaker.serializers.CSVSerializer(),
    deserializer=sagemaker.deserializers.CSVDeserializer(),
)

# Built-in algorithms typically expect feature columns only (no target column).
df = pd.read_csv("small-dataframe.csv")
result = predictor.predict(df)

Alternatively if having the test UI is important for your use-case, you could explore other SageMaker algorithms that support JSON I/O? If you're using XGBoost you could consider using it in "framework mode" instead of "algorithm mode", letting you provide a custom inference script to parse JSON or whatever other request formats you'd like. If you're using AutoGluon-Tabular or one of the other "JumpStart-based" algorithms, you may be able to download, modify, and re-upload the inference deploy_source_uri (see this example) to likewise accept custom request/response types.
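To illustrate the "framework mode" route: framework containers let you provide an inference script with an `input_fn(request_body, request_content_type)` hook that decides which content types to accept. The fragment below is a hypothetical sketch - the JSON payload shape (`{"instances": [...]}`) is an assumption, and a real XGBoost script would convert the parsed rows to a DMatrix before prediction.

```python
import json

def input_fn(request_body, request_content_type):
    """Deserialize an inference request into a list of feature rows.

    Because you own this hook in framework mode, you choose which
    content types the endpoint accepts - including application/json.
    """
    if request_content_type == "application/json":
        # Assumed payload shape: {"instances": [[f1, f2, ...], ...]}
        return json.loads(request_body)["instances"]
    if request_content_type == "text/csv":
        return [
            [float(v) for v in line.split(",")]
            for line in request_body.strip().splitlines()
        ]
    raise ValueError(f"Unsupported content type: {request_content_type}")
```

With a script like this deployed, the Studio test UI's application/json requests would no longer be rejected with a 415.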

Alex_T (AWS Expert)
answered a year ago

Thanks @Alex for the update. It would be good if the AWS test endpoint UI supported other content types as well. Every new developer/learner starts with the Abalone example, so they expect to see results in the UI interface too. This is a very basic, initial requirement for the AWS test endpoint UI.

Thanks again.

answered a year ago
