Needed: .jsonl format for AWS Bedrock batch interface for Meta models


I am currently trying to implement the Bedrock batch interface. While creating the .jsonl file, will the format change for different models? Also, I need an API for monitoring the jobs. Currently there are SDKs only for C++ and Java. When will we be able to get an SDK for JS?

asked 19 days ago · 40 views
2 Answers

Yes, the format of the .jsonl file can change depending on the model you're using for AWS Bedrock batch inference. For Meta models, as well as Anthropic models, the format should follow a specific structure.

Each line in your .jsonl file should be a JSON object with the following format:

{ "recordId": "11 character alphanumeric string", "modelInput": {JSON body} }

The "recordId" field is optional but can be useful for tracking individual records. The "modelInput" field is crucial and should contain the actual input for the model.

For Meta models specifically, you would structure the "modelInput" to match the requirements of the particular Meta model you're using. This typically includes the prompt or input text you want the model to process.

It's important to note that while you can technically add custom fields to your input records, these will be ignored during the batch inference process. The job will focus on the "modelInput" content to generate the output.
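To make the record structure concrete, here is a minimal sketch in JavaScript that builds .jsonl lines for a Meta Llama model. The generation parameters (prompt, max_gen_len, temperature, top_p) follow the documented Meta Llama inference body; the record IDs and prompt texts are placeholders.

```javascript
// Build one .jsonl line for Bedrock batch inference with a Meta Llama model.
// recordId is optional; Bedrock's examples use an 11-character alphanumeric ID.
function buildRecord(recordId, promptText) {
  return JSON.stringify({
    recordId,
    modelInput: {
      prompt: promptText,   // Meta Llama models take a "prompt" field
      max_gen_len: 512,
      temperature: 0.5,
      top_p: 0.9,
    },
  });
}

// One JSON object per line, newline-separated — this is the .jsonl payload
// you would write to a file and upload to your input S3 bucket.
const lines = [
  buildRecord("REC00000001", "Summarize: ..."),
  buildRecord("REC00000002", "Translate to French: ..."),
].join("\n");
```

Because each line is an independent JSON object, you can stream-append records for large batches without holding the whole file in memory.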

Regarding monitoring the jobs, Amazon Bedrock provides APIs for managing and monitoring batch inference jobs. There is no separate, dedicated JavaScript SDK for Bedrock, but the AWS SDK for JavaScript (v3) already includes Bedrock clients, which let you create, describe, and list batch inference jobs.

For example, you can use the CreateModelInvocationJob, GetModelInvocationJob, and ListModelInvocationJobs operations to manage your batch inference jobs programmatically.

As for when a dedicated JavaScript SDK for Bedrock will be available, AWS regularly updates and expands its SDK offerings, but no specific timeline has been announced. In practice, the Bedrock clients in the AWS SDK for JavaScript (v3) already cover this functionality.
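As a sketch of how those operations fit together in the AWS SDK for JavaScript (v3), the snippet below creates a batch job and polls its status. The bucket URIs, role ARN, region, and model ID are placeholders you would replace with your own; the SDK package is loaded lazily inside the function so the helper can be inspected without it installed.

```javascript
// Pure helper: build the CreateModelInvocationJob request shape.
function buildCreateJobInput({ jobName, modelId, roleArn, inputUri, outputUri }) {
  return {
    jobName,
    modelId,
    roleArn,
    inputDataConfig: { s3InputDataConfig: { s3Uri: inputUri } },
    outputDataConfig: { s3OutputDataConfig: { s3Uri: outputUri } },
  };
}

// Create the job, then poll GetModelInvocationJob until it reaches a
// terminal state. Requires @aws-sdk/client-bedrock and AWS credentials.
async function runAndPoll(settings) {
  const {
    BedrockClient,
    CreateModelInvocationJobCommand,
    GetModelInvocationJobCommand,
  } = await import("@aws-sdk/client-bedrock");

  const client = new BedrockClient({ region: "us-east-1" }); // placeholder region
  const { jobArn } = await client.send(
    new CreateModelInvocationJobCommand(buildCreateJobInput(settings))
  );

  for (;;) {
    const job = await client.send(
      new GetModelInvocationJobCommand({ jobIdentifier: jobArn })
    );
    if (!["Submitted", "Validating", "Scheduled", "InProgress"].includes(job.status)) {
      return job; // Completed, Failed, Stopped, ...
    }
    await new Promise((resolve) => setTimeout(resolve, 30_000)); // poll every 30 s
  }
}
```

ListModelInvocationJobsCommand works the same way if you need to enumerate jobs rather than track a single ARN.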
Sources
Can you specify custom data in the input bucket's jsonl records when doing batch inferencing? | AWS re:Post
Enhance call center efficiency using batch inference for transcript summarization with Amazon Bedrock | AWS Machine Learning Blog
Format and upload your batch inference data - Amazon Bedrock

answered 19 days ago

❓ AWS Bedrock Batch Inference: How to Preserve Custom recordId and Metadata in .jsonl Input?

I'm using AWS Bedrock batch inference to process prompts in bulk. Until yesterday, I was able to use custom recordId values in the .jsonl input file and they were properly reflected in the output.

However, today when I run the same process (with 100 records), the response JSONL is assigning recordIds automatically from 1 to 100, ignoring my custom ones.

❗ Problem:

My .jsonl file contains entries like this:

{
  "recordId": "txn_123_v1",
  "input": {
    "prompt": "Extract the carpet area from the following description: ..."
  }
}

But in the output file, I'm seeing recordIds like 1, 2, ..., 100 instead of my custom IDs.

💡 Goal:

I want to preserve custom recordIds and possibly include additional metadata (like transaction IDs, village names, etc.) in the input and get them reflected in the output. This helps us track prompts, versions, and results cleanly.

❓ Questions:

  1. Is there a specific way to format the .jsonl input to ensure that custom recordIds are preserved in the output?
  2. Is it possible to include extra metadata in the input (e.g., under a metadata field), and will Bedrock include it in the response?
  3. Is there any recent change in Bedrock batch processing that may cause this shift in behavior?
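For reference, a sketch of the tracking pattern described above, assuming the documented behavior that Bedrock echoes the input recordId in each output line while ignoring any other custom fields: keep the metadata out of the .jsonl entirely and rejoin it afterwards by recordId. The field names (txnId, village) are examples, not Bedrock fields.

```javascript
// Local sidecar map: recordId -> application metadata that Bedrock
// would otherwise ignore if embedded in the input records.
const metadataById = new Map([
  ["txn_123_v1", { txnId: "123", version: "v1", village: "Example" }],
]);

// After the job finishes, each output line should echo the input recordId,
// which keys back into the local map.
function rejoin(outputLine) {
  const out = JSON.parse(outputLine);
  return { ...out, metadata: metadataById.get(out.recordId) ?? null };
}
```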

answered 12 days ago
