Hello.
It has been reported that several users have had their Bedrock quotas set to 0 over the past few days, and I suspect this is related:
https://repost.aws/questions/QUK8qnLwJRQhOPV58H0sC41Q/bedrock-too-many-requests-please-wait-before-trying-again
In this case the user cannot resolve the issue themselves, so they must contact AWS Support at the URL below.
https://console.aws.amazon.com/support
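Before opening a support case, you can check whether any of your account's applied Bedrock quotas have actually been set to 0. A minimal sketch using the AWS CLI (the `bedrock` service code is standard, but the exact quota names vary, so verify against the Service Quotas console for your account):

```shell
# List applied Bedrock quotas in us-west-2 whose current value is 0.
# An empty result means no quota in this region has been zeroed out.
aws service-quotas list-service-quotas \
    --service-code bedrock \
    --region us-west-2 \
    --query 'Quotas[?Value==`0`].[QuotaName,Value]' \
    --output table
```

If this does show quotas at 0, include that output in your support case, since it confirms the account-level restriction described above.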
Based on the information provided, there are a few potential reasons why you're unable to request access to the Llama 3.1 70B model in the US West (Oregon) Region (us-west-2) from Australia:
- Regional availability: As of the latest information available, Llama 3.1 models (including the 70B version) are generally available in the US West (Oregon) Region. However, availability can sometimes be limited or rolled out gradually.
- Account restrictions: Some models may not be available to all AWS accounts, especially if your account is new or has certain restrictions.
- Geographical limitations: While you're trying to access the model from a US region, your physical location in Australia might be affecting your ability to request access. Some AI models have geographical restrictions for various reasons, including regulatory compliance.
- UI issues: Sometimes the inability to select a model or see the request-access button is due to a user interface problem or a browser issue.
To resolve this issue, you can try the following steps:
- Ensure you're logged in with the appropriate permissions, ideally using the root account or an IAM user with sufficient privileges.
- Try requesting access from a different AWS Region, such as US East (N. Virginia), which often has the widest selection of services available.
- Clear your browser cache and cookies, or try a different browser.
- If the issue persists, contact AWS Support. They can provide specific information about your account's eligibility for the Llama 3.1 70B model and any restrictions that may be in place.
Remember that model availability can change, and AWS regularly updates its offerings. If you're unable to resolve the issue through these steps, reaching out to AWS Support would be the best course of action for a definitive answer regarding your specific situation.
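As a quick sanity check on regional availability, you can also list the foundation models that Bedrock exposes in us-west-2. A minimal sketch using the AWS CLI (the `--by-provider meta` filter assumes Meta's standard provider name; a model appearing here means it is offered in the Region, though access may still need to be granted in the console):

```shell
# List Meta model IDs offered by Bedrock in us-west-2.
# Look for an entry such as meta.llama3-1-70b-instruct-v1:0.
aws bedrock list-foundation-models \
    --region us-west-2 \
    --by-provider meta \
    --query 'modelSummaries[].modelId' \
    --output table
```

If the 70B model ID is listed here but you still cannot request access in the console, that points to an account-level or UI issue rather than regional availability.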
Sources
Announcing Llama 3.1 405B, 70B, and 8B models from Meta in Amazon Bedrock | AWS News Blog
No model request possible | AWS re:Post
Bedrock Claude Model Access Request Failure | AWS re:Post
