
Follow up question to Data limits for bedrock runtime agents

0

I asked this question and then had a follow up question that didn't get answered: https://repost.aws/questions/QU7-EgU32nTCSZLLIubI4vUg/data-limits-for-bedrock-runtime-agents.

What are the memory constraints of the agent, and where is that documented if it varies by agent? Regarding the rate limiting/quotas on the Bedrock service, to clarify: I'm specifically referring to providing information to the agent for code interpretation that would normally be included in attached files. Do you have any more clarity on what the limits are in that case for aggregate information being passed in through data from API call(s) resulting from any action groups/Lambda functions?

Can you elaborate on your initial response to my question?

1 Answer
0

Based on the available information, I can provide some clarity on the memory constraints and quotas for AWS Bedrock Agents.

For Amazon Bedrock AgentCore, there are specific invocation limits documented:

  • Maximum payload size: 100 MB for request/response payloads
  • Streaming chunk size: 10 MB for individual chunks
  • Request timeout: 15 minutes for synchronous requests
  • Invocations per second: 25 per endpoint (this is adjustable)
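Since the 100 MB cap applies to the serialized payload, one practical step is to check sizes client-side before invoking rather than waiting for a server-side rejection. A minimal sketch (the helper name and the client-side-check approach are illustrative, not part of any Bedrock SDK):

```python
# Sketch: validate a serialized payload against the documented 100 MB
# AgentCore request/response cap before sending it.
MAX_PAYLOAD_BYTES = 100 * 1024 * 1024  # 100 MB

def within_payload_limit(payload: bytes, limit_bytes: int = MAX_PAYLOAD_BYTES) -> bool:
    """Return True if the serialized payload fits under the limit."""
    return len(payload) <= limit_bytes

# Usage: serialize first, then check, e.g.
#   body = json.dumps(data).encode("utf-8")
#   if not within_payload_limit(body):
#       ...split the request or narrow the query...
```

Checking the encoded byte length (not the character count) matters because multi-byte UTF-8 characters can push a string past the limit even when `len(str)` looks safe.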

Regarding your specific question about code interpretation and data passed through API calls from action groups/Lambda functions, there isn't explicit documentation on memory constraints for this specific use case in the provided sources.

When your agent retrieves data via API calls through custom action groups (typically implemented using Lambda functions), the data processing is primarily constrained by:

  1. The Lambda function's own memory and execution time limits
  2. The AgentCore payload size limits mentioned above (100 MB maximum)
  3. The rate limits for API calls (25 invocations per second per endpoint by default)
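Given constraints 1–3, the action group Lambda is a sensible place to cap how much data flows back toward the agent. Below is a rough sketch of such a handler; the response shape follows the general Lambda output format for OpenAPI-schema action groups, and `fetch_records` (plus the 90 MB headroom figure) is a hypothetical stand-in, not an AWS API:

```python
import json

# Leave headroom under the 100 MB payload cap for envelope overhead
# (the 90 MB figure is an illustrative choice, not a documented limit).
MAX_BODY_BYTES = 90 * 1024 * 1024

def fetch_records():
    """Hypothetical data source the action group exposes."""
    return [{"id": 1, "value": "example"}]

def lambda_handler(event, context):
    body = json.dumps(fetch_records())
    # Truncate defensively so the aggregate response stays under the limit.
    if len(body.encode("utf-8")) > MAX_BODY_BYTES:
        body = json.dumps({"error": "result too large", "hint": "narrow the query"})
    return {
        "messageVersion": "1.0",
        "response": {
            "actionGroup": event["actionGroup"],
            "apiPath": event["apiPath"],
            "httpMethod": event["httpMethod"],
            "httpStatusCode": 200,
            "responseBody": {"application/json": {"body": body}},
        },
    }
```

Note that the Lambda's own response size limit (well below 100 MB for synchronous invocations) will usually be the binding constraint here, so paginating or summarizing at the source is generally preferable to returning large aggregates.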

For code interpretation specifically, the sources don't provide detailed information about memory constraints when processing data retrieved from API calls. However, the general payload size limits would still apply to the data being passed between components.

If you're experiencing rate limiting issues with your agent, you may need to implement request rate limiting or queueing in your application, or potentially request a quota increase through AWS Support.
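The client-side mitigation above can be sketched as a generic retry helper with exponential backoff and jitter; `with_backoff` is illustrative, and in real use the retryable condition would be a `botocore` `ClientError` whose error code is `ThrottlingException` rather than a bare exception class:

```python
import random
import time

def with_backoff(fn, max_retries=5, base_delay=1.0, retryable=(Exception,)):
    """Call fn(), retrying with exponential backoff plus jitter on
    retryable errors. In practice you would catch botocore's
    ClientError and check response["Error"]["Code"] == "ThrottlingException"."""
    for attempt in range(max_retries):
        try:
            return fn()
        except retryable:
            if attempt == max_retries - 1:
                raise  # out of retries; surface the throttling error
            # Exponential backoff: 1x, 2x, 4x... the base delay, with jitter
            time.sleep(base_delay * (2 ** attempt) + random.uniform(0, base_delay))
```

Many AWS SDKs (including boto3) also support configurable built-in retry modes, which may be simpler than hand-rolling this when the SDK makes the call directly.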
Sources
Data limits for bedrock runtime agents | AWS re:Post
Quotas for Amazon Bedrock AgentCore - Amazon Bedrock AgentCore
Bedrock Test Agent quota error | AWS re:Post

answered a month ago
