
I can't activate LLM models in AWS Bedrock


Hi,

I'm trying to enable different LLM models in Bedrock. However, I'm always getting the same error:

Failed to create regional entitlement. Account is not authorized for the model

I have verified that the user I'm using has the right permissions (I even tried with the root account). I have also tried the activation process in multiple regions just to make sure it's not a regional issue; in particular, I tried eu-central-1 (Frankfurt) and us-east-1 (N. Virginia).

Does this sound familiar to anybody?

Thanks in advance!

asked a month ago · 158 views
4 Answers

The error you're encountering, "Failed to create regional entitlement. Account is not authorized for the model," suggests that there might be an issue with your account's authorization to access specific models in Amazon Bedrock. This is a common challenge that users face when trying to enable LLM models.

To address this issue, consider the following steps:

  1. Verify model availability: Ensure that the models you're trying to access are available in the region you're working in. Some models may not be accessible in all AWS regions. (A quick programmatic check for steps 1-3 is sketched right after this list.)

  2. Check model access status: In the Amazon Bedrock console, navigate to the "Model access" page. Here, you should be able to see the access status for different models. Look for "Access granted" status for the models you want to use.

  3. Request model access: If you don't see "Access granted" for the desired models, you may need to explicitly request access. Use the "Modify model access" option in the Bedrock console to request access to the specific models you need.

  4. IAM permissions: While you mentioned checking permissions, it's worth double-checking that your IAM user or role has the necessary permissions, including the "AmazonBedrockFullAccess" policy or an equivalent custom policy that allows managing access to Bedrock models. (A quick sanity check is sketched further down this answer.)

  5. Account restrictions: In some cases, there might be account-level restrictions preventing access to certain models. If you've tried with the root account and still face issues, it may be worth contacting AWS Support to check for any account-specific restrictions.

  6. Region selection: Although you've tried multiple regions, ensure you're in the correct region when requesting access and when trying to use the models. The US East (N. Virginia) region (us-east-1) is often a good starting point as it typically has the widest range of available models.

  7. Wait time: After requesting access, it may take several minutes for the access to be granted. Be patient and check back after a short while.
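
If you prefer to run steps 1-3 from code instead of the console, here is a minimal sketch using boto3 (assuming it is installed and your credentials are configured). The region and model ID below are placeholders, so substitute the ones you actually requested. It lists the foundation models offered in the region and then attempts a tiny invocation, where an AccessDeniedException usually means the entitlement has not been granted yet.

```python
# Illustrative sketch only. REGION and MODEL_ID are placeholders; replace them
# with the region and model you requested access for.
import json

import boto3
from botocore.exceptions import ClientError

REGION = "us-east-1"                       # region where you requested model access
MODEL_ID = "amazon.titan-text-express-v1"  # example model ID; replace with yours

# Step 1: confirm the model is offered in this region at all.
bedrock = boto3.client("bedrock", region_name=REGION)
summaries = bedrock.list_foundation_models()["modelSummaries"]
print(MODEL_ID, "offered in", REGION, ":",
      any(m["modelId"] == MODEL_ID for m in summaries))

# Steps 2-3: try a tiny invocation. An AccessDeniedException here usually means
# the entitlement was never granted for this account/region.
runtime = boto3.client("bedrock-runtime", region_name=REGION)
try:
    runtime.invoke_model(
        modelId=MODEL_ID,
        body=json.dumps({"inputText": "ping"}),  # request body format is model-specific
        contentType="application/json",
        accept="application/json",
    )
    print("Model access looks granted.")
except ClientError as err:
    error = err.response["Error"]
    print("Invocation failed:", error["Code"], "-", error["Message"])
```

If the invocation still fails with the "not authorized for the model" message even after the console shows "Access granted", that points to the account-level problem described in step 5.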

If you've gone through these steps and are still encountering issues, it would be advisable to open a support case with AWS to investigate any account-specific problems that might be preventing model access.
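
On step 4 specifically, a quick and purely illustrative sanity check, assuming boto3 and the credentials of the identity you are testing, is to print the caller identity and, for an IAM user, the managed policies attached to it, so you can confirm that AmazonBedrockFullAccess or your equivalent custom policy is actually there:

```python
# Illustrative sketch only: prints the caller identity and, if it is an IAM user,
# the managed policies attached to it. Roles would need list_attached_role_policies
# instead, and inline policies are not covered here.
import boto3

sts = boto3.client("sts")
iam = boto3.client("iam")

arn = sts.get_caller_identity()["Arn"]
print("Caller identity:", arn)

if ":user/" in arn:
    user_name = arn.split("/")[-1]
    attached = iam.list_attached_user_policies(UserName=user_name)["AttachedPolicies"]
    for policy in attached:
        print("Attached managed policy:", policy["PolicyName"])
else:
    print("Caller is not an IAM user; check the role's attached policies instead.")
```

Keep in mind that missing IAM permissions would normally surface as an explicit AccessDenied error rather than the entitlement message you are seeing, so if the policies look right, the account-level check with AWS Support is the more likely path.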

Remember that Amazon Bedrock is a relatively new service, and some features or models might still be in preview or have limited availability. Always check the latest documentation for any updates or known issues.

Sources
Community | Getting started with different LLMs on Amazon Bedrock
Getting started with Amazon Bedrock - Amazon Bedrock
Resolve InvokeModel API error in Amazon Bedrock | AWS re:Post

answered a month ago
EXPERT
reviewed a month ago

Thanks Adeleke,

I did open a support ticket, but I haven't heard back from AWS yet :S. I guess I'll have to wait a few days.

Thank you anyway!

answered a month ago
  • Hi there, I have the same issue here. Did you get any help?


Hello cjuega,

The error you are getting seems to be an issue with your account. The best way forward is to open a support case with AWS so they can check your account, since on AWS re:Post we can't do that check.

Thanks BR

EXPERT
answered a month ago

Hey, I opened a support ticket with AWS. There seemed to be something wrong with my account that couldn't be addressed from the customer side.

After a few days, AWS managed to solve it on their end, and I can now use the LLM models.

Posting here in case anybody else experiences the same problem.

answered 19 days ago
