Hello,
The Amazon Bedrock batch inference API supports model input in the format of the InvokeModel API.
The format of the modelInput JSON object must match the body field for the model that you use in the InvokeModel request. https://docs.aws.amazon.com/bedrock/latest/userguide/batch-inference-data.html
InvokeModel request bodies can't include documents. To include a document during inference, use the Chat/text playground in the AWS Management Console or send a Converse request. https://docs.aws.amazon.com/bedrock/latest/userguide/inference-invoke.html
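For comparison, a single (non-batch) request that attaches a document goes through the Converse API. Here is a minimal sketch of the `messages` payload; the model ID, document name, and prompt are illustrative assumptions, and note that Converse takes raw document bytes (the SDK handles encoding):

```python
def build_converse_messages(csv_bytes: bytes, prompt: str) -> list:
    """Build a Converse `messages` list that attaches a CSV document."""
    return [
        {
            "role": "user",
            "content": [
                {"text": prompt},
                {
                    "document": {
                        "format": "csv",
                        "name": "data",  # display name, no file extension
                        "source": {"bytes": csv_bytes},  # raw bytes, not base64
                    }
                },
            ],
        }
    ]

# The actual call would look something like (model ID is an assumption):
# import boto3
# client = boto3.client("bedrock-runtime")
# resp = client.converse(
#     modelId="us.amazon.nova-lite-v1:0",
#     messages=build_converse_messages(open("data.csv", "rb").read(),
#                                      "Summarize this CSV."),
# )
```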
It appears that you're encountering an issue with the document format specification in your batch inference request for Amazon Nova Lite. While CSV is indeed one of the supported document formats for Amazon Nova models, the error message suggests that there might be a discrepancy between the expected input format for batch inference and the format you're using.
For batch inference with Amazon Bedrock models, including Nova Lite, the input format is typically more streamlined compared to the conversational API format you're using. The batch inference input usually doesn't include the full conversation structure with system prompts and multiple messages.
Instead, try simplifying your input to focus on the essential elements needed for inference. Here's a suggested format for your batch inference input:
```json
{
  "recordId": "RECORD_ID",
  "modelInput": {
    "prompt": "YOUR_PROMPT_HERE",
    "document": {
      "format": "csv",
      "source": { "bytes": "CSV_IN_BASE64_STRING" }
    }
  }
}
```
In this format:
- Include the "recordId" if you want to track individual records.
- Under "modelInput", provide a "prompt" field with your instruction or question.
- Include the "document" object with the "format" and "source" as you had before.
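Putting those steps together, one JSONL record could be assembled like this. This is a sketch of the suggested format above, not a confirmed schema; the record ID, prompt, and CSV content are placeholders:

```python
import base64
import json


def build_batch_record(record_id: str, prompt: str, csv_bytes: bytes) -> str:
    """Assemble one JSONL line in the suggested batch-inference format."""
    record = {
        "recordId": record_id,
        "modelInput": {
            "prompt": prompt,
            "document": {
                "format": "csv",
                # The document bytes are base64-encoded into the JSON string.
                "source": {"bytes": base64.b64encode(csv_bytes).decode("ascii")},
            },
        },
    }
    return json.dumps(record)


line = build_batch_record("rec-001", "Summarize this CSV.", b"a,b\n1,2\n")
```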
This simplified structure should be more in line with what the batch inference endpoint expects. If you need to include system instructions or additional context, consider incorporating them into the prompt text.
If you continue to experience issues after adjusting your input format, it may be worth double-checking a few other points:
- Ensure that the base64 encoding of your CSV is correct.
- Verify that the CSV content adheres to the size limits (should not exceed 4.5MB per document).
- Confirm that you're not exceeding the maximum of 5 documents per request.
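The checks above can be automated before uploading the JSONL file. A sketch that applies the 4.5 MB and 5-document limits from the list (the record shape mirrors the suggested format and is an assumption):

```python
import base64
import json

MAX_DOC_BYTES = int(4.5 * 1024 * 1024)  # 4.5 MB per document
MAX_DOCS_PER_REQUEST = 5


def check_record(line: str) -> list:
    """Return a list of problems found in one JSONL batch record."""
    problems = []
    record = json.loads(line)
    doc = record.get("modelInput", {}).get("document")
    # Accept either a single document object or a list of them.
    docs = doc if isinstance(doc, list) else ([doc] if doc else [])
    if len(docs) > MAX_DOCS_PER_REQUEST:
        problems.append(f"too many documents: {len(docs)}")
    for d in docs:
        encoded = d.get("source", {}).get("bytes", "")
        try:
            raw = base64.b64decode(encoded, validate=True)
        except Exception:
            problems.append("document bytes are not valid base64")
            continue
        if len(raw) > MAX_DOC_BYTES:
            problems.append(f"document exceeds 4.5 MB: {len(raw)} bytes")
    return problems
```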
If the problem persists after these adjustments, it would be advisable to contact AWS Support for more specific guidance, as there might be nuances to the Nova Lite batch inference process that aren't fully documented or that have changed recently.
Sources
Document understanding - Amazon Nova
Error in AWS Bedrock Batch Inteference | AWS re:Post
Can you specify custom data in the input bucket's jsonl records when doing batch inferencing? | AWS re:Post
This is not accurate. I get the following error with this suggestion:
Malformed input request: #: required key [messages] not found, please reformat your input and try again.
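That error suggests the batch schema mirrors the model's native InvokeModel body, which for Nova is messages-based rather than prompt-based. A hedged sketch of a text-only record in that shape; the `schemaVersion` value and field names are assumptions based on the Nova InvokeModel request format, and per this thread a document attachment is still rejected in batch mode:

```python
import json


def build_nova_batch_record(record_id: str, prompt: str) -> str:
    """Assemble a messages-based JSONL record (text only, no document)."""
    record = {
        "recordId": record_id,
        "modelInput": {
            "schemaVersion": "messages-v1",  # assumed Nova request schema
            "messages": [
                {"role": "user", "content": [{"text": prompt}]}
            ],
            "inferenceConfig": {"maxTokens": 512},
        },
    }
    return json.dumps(record)
```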
Could you please send an email to malterei@amazon.ch so that I can register your feedback requesting document support in Amazon Bedrock batch inference?