Based on the information you've provided, it seems you're encountering an issue with cross-region inference for the Anthropic Claude 3.5 Haiku model in Amazon Bedrock. Here are a few points to consider and potential solutions:
- Cross-region inference profiles: For Anthropic Claude models, you need to use a cross-region inference profile. These profiles are system-defined and named after the model they support and the regions they include (see the first sketch after this list for one way to look them up).
- Correct inference profile: Make sure you're using the correct inference profile for the Anthropic Claude 3.5 Haiku model. The profile should include both us-east-1 and us-west-2 regions.
- IAM permissions: While you've attached the AmazonBedrockFullAccess policy, you might need to explicitly allow cross-region inference API actions. Ensure your IAM role has the necessary permissions to use cross-region inference, particularly the InvokeModel* APIs in both us-east-1 and us-west-2 regions (see the policy sketch after this list).
- API usage: When using the Converse API with a cross-region inference profile, specify either the ARN or the ID of the inference profile, not the model ARN directly (see the Converse sketch after this list).
- Region selection: Ensure you're calling the inference profile from one of the regions it includes (in this case, us-east-1 or us-west-2).
- Access request: For Anthropic Claude models, there's a separate process to request access. Make sure you've submitted the use case details form through the Amazon Bedrock console and have been granted access specifically for the Anthropic Claude 3.5 Haiku model.
- Verify access: After being granted access, you can test it using the Text or Chat playground in the console or through an API call such as the Converse sketch below.
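As a minimal sketch of the profile lookup, the boto3 control-plane client can list the system-defined cross-region inference profiles so you can confirm the ID and the regional model ARNs it routes to. The region and the name-matching string here are assumptions; verify the exact profile name and ID in your own account.

```python
import boto3

# Control-plane client in one of the regions the profile covers (assumption: us-east-1).
bedrock = boto3.client("bedrock", region_name="us-east-1")

# List system-defined cross-region inference profiles and pick out the Claude 3.5 Haiku one.
resp = bedrock.list_inference_profiles(typeEquals="SYSTEM_DEFINED")
for profile in resp["inferenceProfileSummaries"]:
    if "Claude 3.5 Haiku" in profile["inferenceProfileName"]:
        print("Profile ID: ", profile["inferenceProfileId"])
        print("Profile ARN:", profile["inferenceProfileArn"])
        # The models list shows which regional model ARNs the profile can route to;
        # for this profile it should include both us-east-1 and us-west-2.
        for model in profile["models"]:
            print("  routes to:", model["modelArn"])
```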
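For the IAM point, a role using a cross-region profile needs InvokeModel permissions on both the inference profile and the underlying foundation model in every region the profile can route to. The following is a rough sketch only, assuming the us.anthropic.claude-3-5-haiku-20241022-v1:0 profile, a placeholder account ID, and illustrative role/policy names; confirm the exact ARNs in your account before applying anything like this.

```python
import json
import boto3

ACCOUNT_ID = "111122223333"  # placeholder -- replace with your AWS account ID
MODEL_ID = "anthropic.claude-3-5-haiku-20241022-v1:0"  # assumed model ID; verify in your console
PROFILE_ID = f"us.{MODEL_ID}"  # assumed US cross-region inference profile ID

policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": [
                "bedrock:InvokeModel",
                "bedrock:InvokeModelWithResponseStream",
            ],
            "Resource": [
                # The inference profile itself (lives in the calling region's account).
                f"arn:aws:bedrock:us-east-1:{ACCOUNT_ID}:inference-profile/{PROFILE_ID}",
                # The foundation model in each region the profile can route to.
                f"arn:aws:bedrock:us-east-1::foundation-model/{MODEL_ID}",
                f"arn:aws:bedrock:us-west-2::foundation-model/{MODEL_ID}",
            ],
        }
    ],
}

# Attach the policy inline to the role used for inference (role/policy names are illustrative).
iam = boto3.client("iam")
iam.put_role_policy(
    RoleName="my-bedrock-inference-role",
    PolicyName="bedrock-cross-region-haiku",
    PolicyDocument=json.dumps(policy),
)
```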
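And for the Converse call itself, here is a short sketch that passes the inference profile ID (not the model ARN) as modelId. The profile ID, prompt, and inference settings are assumptions; the same ID should work whether you call from us-east-1 or us-west-2.

```python
import boto3

# Runtime client in one of the regions the profile includes (assumption: us-east-1).
runtime = boto3.client("bedrock-runtime", region_name="us-east-1")

response = runtime.converse(
    # Pass the inference profile ID (or its ARN) as the modelId.
    modelId="us.anthropic.claude-3-5-haiku-20241022-v1:0",
    messages=[
        {"role": "user", "content": [{"text": "Say hello in one sentence."}]}
    ],
    inferenceConfig={"maxTokens": 100},
)

print(response["output"]["message"]["content"][0]["text"])
```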
If you've confirmed all these points and are still encountering issues, you may need to contact AWS support for further assistance, as there might be a specific configuration or account-level issue that needs to be addressed.
Sources
- Access Anthropic models on Amazon Bedrock | AWS re:Post
- Getting started with cross-region inference in Amazon Bedrock | AWS Machine Learning Blog
- Use an inference profile in model invocation - Amazon Bedrock
- Supported Regions and models for inference profiles - Amazon Bedrock
The one thing I missed was requesting access via the Amazon Bedrock console. I had assumed that having quota for the model meant I already had access, which was an incorrect assumption. After requesting and being granted access, my issue was resolved.
