
Region: us-west-2, Model Access Status: Unavailable


I tried to request access to Llama 3.1 70B in the us-west-2 region. I am in Australia. However, the model access status shows as unavailable, and I cannot select the model to request access.

2 Answers

Hello.

Several users have reported that their Bedrock quotas were set to 0 over the past few days, and this is probably related:
https://repost.aws/questions/QUK8qnLwJRQhOPV58H0sC41Q/bedrock-too-many-requests-please-wait-before-trying-again

This is not something you can resolve yourself, so you will need to contact AWS Support using the URL below.
https://console.aws.amazon.com/support
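If you want to confirm whether a quota was actually zeroed out before opening a case, you can inspect the Bedrock service quotas programmatically. Below is a minimal sketch: the live call via boto3's Service Quotas client is shown only as a comment, and the sample response and quota names are hypothetical, for illustration.

```python
# Sketch: flag Amazon Bedrock service quotas whose applied value is 0.
# The live call would use boto3's Service Quotas client, e.g.:
#   import boto3
#   resp = boto3.client("service-quotas", region_name="us-west-2") \
#               .list_service_quotas(ServiceCode="bedrock")
# The sample response below is hypothetical, for illustration only.

def zeroed_quotas(response):
    """Return the names of quotas whose applied value is 0."""
    return [
        q["QuotaName"]
        for q in response.get("Quotas", [])
        if q.get("Value") == 0
    ]

sample_response = {
    "Quotas": [
        {"QuotaName": "On-demand requests per minute for Llama 3.1 70B", "Value": 0},
        {"QuotaName": "Model units per provisioned model", "Value": 2},
    ]
}

print(zeroed_quotas(sample_response))
```

If this returns a non-empty list for your account, include those quota names in the support case so AWS can restore them.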

EXPERT
answered a year ago
EXPERT
reviewed a year ago

Based on the information provided, there are a few potential reasons why you're unable to request access to the Llama 3.1 70B model in the US West (Oregon) region (us-west-2) from Australia:

  1. Regional availability: Llama 3.1 models (including the 70B version) are generally available in the US West (Oregon) Region, but availability can be limited or rolled out gradually.

  2. Account restrictions: Some models may not be available to all AWS accounts, especially if your account is new or has certain restrictions.

  3. Geographical limitations: While you're trying to access the model from a US region, your physical location in Australia might be affecting your ability to request access. Some AI models have geographical restrictions due to various reasons, including regulatory compliance.

  4. UI issues: Sometimes, the inability to select a model or see the request access button can be due to user interface problems or browser issues.

To resolve this issue, you can try the following steps:

  1. Ensure you're logged in with sufficient permissions — an IAM user or role with privileges to manage Bedrock model access (using the root account for day-to-day tasks is not recommended).

  2. Try accessing the model from a different AWS region, such as US East (N. Virginia), which often has the widest selection of services available.

  3. Clear your browser cache and cookies, or try using a different browser.

  4. If the issue persists, it's recommended to contact AWS Support. They can provide specific information about your account's eligibility for accessing the Llama 3.1 70B model and any potential restrictions that may be in place.
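To help rule out a console UI problem (step 3 above), you can also check programmatically whether the model is listed at all in your target region. Below is a minimal sketch: the live boto3 Bedrock call is shown only as a comment, and the sample model summaries are hypothetical, for illustration.

```python
# Sketch: check whether a Llama 3.1 70B model ID appears in a region's
# foundation-model listing. The live call would use boto3's Bedrock client:
#   import boto3
#   resp = boto3.client("bedrock", region_name="us-west-2").list_foundation_models()
#   summaries = resp["modelSummaries"]
# The sample summaries below are hypothetical, for illustration only.

def matching_models(summaries, keyword):
    """Return model IDs that contain the given keyword."""
    return [s["modelId"] for s in summaries if keyword in s["modelId"]]

sample_summaries = [
    {"modelId": "meta.llama3-1-70b-instruct-v1:0"},
    {"modelId": "anthropic.claude-3-sonnet-20240229-v1:0"},
]

print(matching_models(sample_summaries, "llama3-1-70b"))
```

If the model does not appear in the listing for your region, the problem is availability or an account restriction rather than the console UI, and contacting AWS Support is the right next step.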

Remember that model availability can change, and AWS regularly updates its offerings. If you're unable to resolve the issue through these steps, reaching out to AWS Support would be the best course of action for a definitive answer regarding your specific situation.
Sources
Announcing Llama 3.1 405B, 70B, and 8B models from Meta in Amazon Bedrock | AWS News Blog
No model request possible | AWS re:Post
Bedrock Claude Model Access Request Failure | AWS re:Post

answered a year ago
EXPERT
reviewed a year ago
