
bedrock batch inference fails on Llama 3.3 70B Instruct


[Screenshot: batch inference error message]

I am trying to use AWS Bedrock to run batch inference on Llama 3.3 70B Instruct, but the job fails with the message shown below. I don't know why this happens, because batch inference works fine on Llama 3.1 70B Instruct. Can anyone help me with this?

I have attached the error message and the permissions of the role I'm using (which works fine with Llama 3.1 70B Instruct) to the post. [Screenshot: IAM role permissions]

Thank you!

3 Answers

I apologize, but there seems to be a misunderstanding in your question. According to the latest information available, Llama 3.3 70B Instruct is not currently a model offered by Amazon Bedrock. The most recent update to Bedrock's model lineup includes Llama 2 models, but not a Llama 3.3 version.

The error message you're seeing suggests that the model you're trying to use for batch inference is not supported. This could be because you're attempting to use a model that doesn't exist or isn't available for batch inference in Bedrock.

As of the latest update, Llama 2 70B is available for batch inference in certain regions, specifically US East (N. Virginia) and US West (Oregon). If you're trying to use a different model or in a different region, this could explain the error you're encountering.

To resolve this issue, I would recommend the following:

  1. Double-check the model name you're using. Make sure it's a model that's actually available in Amazon Bedrock.
  2. Verify that you're in a region where the model you want to use is supported for batch inference.
  3. If you were successfully using Llama 2 70B Instruct (which I assume you meant instead of Llama 3.1), make sure you haven't changed any settings that might affect model availability.

If you continue to experience issues, it would be best to contact AWS support for more specific assistance, as they can look into the details of your account and setup.
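Steps 1 and 2 above can be checked programmatically. A minimal sketch with boto3, assuming the `bedrock` control-plane client and its `list_foundation_models` call (which returns model summaries with a `modelId` field); the Llama 3.3 model ID in the comment is what the console shows at the time of writing and should be verified there:

```python
def find_model(summaries, model_id):
    """Return the matching summary from a list_foundation_models response, or None."""
    return next((s for s in summaries if s["modelId"] == model_id), None)

def check_model_in_region(model_id, region):
    """Look up model_id among the foundation models visible in the given region."""
    import boto3  # imported here so find_model stays usable without boto3 installed
    bedrock = boto3.client("bedrock", region_name=region)
    resp = bedrock.list_foundation_models()
    return find_model(resp["modelSummaries"], model_id)

# Example (requires AWS credentials):
#   check_model_in_region("meta.llama3-3-70b-instruct-v1:0", "us-west-2")
# returns None if the model is not offered in that region.
```

If this returns None in your region, that alone would explain the batch inference error, independent of the role's permissions.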
Sources
Document history for the Amazon Bedrock User Guide - Amazon Bedrock
Process multiple prompts with batch inference - Amazon Bedrock

answered a year ago

Hi,

To debug this issue, please first check that the region you are using is one where batch inference is supported for Amazon Bedrock. The supported regions are listed in the Bedrock documentation.

To access models in Amazon Bedrock, the role needs the right permissions, which you have shared in the image above, but you also need to request access to the individual Amazon Bedrock foundation models, which you can do in the Bedrock console under Model access. Please check that you have requested access to Llama 3.3 70B Instruct.

I hope this helps.

AWS
EXPERT
answered a year ago

Hi, what is the input format for Llama 3? Thanks

answered 9 months ago
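On the input-format question above: for Meta Llama models, Bedrock batch inference takes a JSONL file in S3 where each line carries a `recordId` and a `modelInput` whose fields match the model's on-demand invoke body (`prompt`, `max_gen_len`, `temperature`). A minimal sketch, with the single-turn Llama 3 chat template assumed from Meta's model documentation:

```python
import json

# Llama 3.x chat template for a single user turn (special tokens per Meta's
# model card); adjust for system prompts or multi-turn conversations.
LLAMA3_TEMPLATE = (
    "<|begin_of_text|><|start_header_id|>user<|end_header_id|>\n\n"
    "{message}<|eot_id|><|start_header_id|>assistant<|end_header_id|>\n\n"
)

def batch_record(record_id, message, max_gen_len=512, temperature=0.5):
    """Serialize one line of a Bedrock batch inference input JSONL file."""
    return json.dumps({
        "recordId": record_id,
        "modelInput": {
            "prompt": LLAMA3_TEMPLATE.format(message=message),
            "max_gen_len": max_gen_len,
            "temperature": temperature,
        },
    })

# One record per line; upload the resulting .jsonl to S3 for the job.
line = batch_record("REC0001", "What is Amazon Bedrock?")
```

Each output line in the results file then pairs the same `recordId` with the model's response, so records can be matched back to their inputs.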
