Based on the available documentation, I can provide some clarity on the memory constraints and quotas for Amazon Bedrock agents.
For Amazon Bedrock AgentCore, there are specific invocation limits documented:
- Maximum payload size: 100 MB for request/response payloads
- Streaming chunk size: 10 MB for individual chunks
- Request timeout: 15 minutes for synchronous requests
- Invocations per second: 25 per endpoint (this is adjustable)
Regarding your question about code interpretation and data passed through API calls from action groups/Lambda functions, the provided sources don't explicitly document memory constraints for that use case.
When your agent retrieves data via API calls through custom action groups (typically implemented using Lambda functions), the data processing is primarily constrained by:
- The Lambda function's own memory and execution time limits
- The AgentCore payload size limits mentioned above (100 MB maximum)
- The rate limits for API calls (25 invocations per second per endpoint by default)
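Given the constraints above, it can help to validate a response's serialized size before returning it from an action-group Lambda. The sketch below is a hypothetical guard (the function name and limit constant are illustrative; the 100 MB figure comes from the documented AgentCore payload cap, though your effective limit may be lower once protocol overhead is accounted for):

```python
import json

# Assumed ceiling based on the documented 100 MB AgentCore payload cap.
MAX_PAYLOAD_BYTES = 100 * 1024 * 1024

def check_payload_size(payload: dict, limit: int = MAX_PAYLOAD_BYTES) -> int:
    """Return the serialized size in bytes, raising if it exceeds the limit.

    Call this on the response body before handing it back to the agent,
    so an oversized payload fails fast with a clear error instead of
    being rejected downstream.
    """
    size = len(json.dumps(payload).encode("utf-8"))
    if size > limit:
        raise ValueError(f"Payload is {size} bytes, exceeding the {limit}-byte limit")
    return size
```

In practice you would truncate or paginate the data (for example, return a summary plus an S3 pointer) rather than fail outright when the check trips.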
For code interpretation specifically, the sources don't detail memory constraints when processing data retrieved from API calls. However, the general payload size limits above would still apply to data passed between components.
If you're experiencing rate limiting issues with your agent, you may need to implement request rate limiting or queueing in your application, or potentially request a quota increase through AWS Support.
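A common pattern for the client-side throttling mentioned above is exponential backoff with jitter around the invocation call. This is a minimal self-contained sketch; `ThrottlingError` stands in for the throttling exception your AWS SDK call would raise (with boto3 you would catch the corresponding `ClientError`/`ThrottlingException` instead):

```python
import random
import time

class ThrottlingError(Exception):
    """Stand-in for the throttling exception raised when a quota is hit."""

def invoke_with_backoff(fn, max_retries=5, base_delay=0.5):
    """Call fn(), retrying with exponential backoff plus jitter on throttling.

    fn would typically wrap your agent invocation, e.g. a
    bedrock-agent-runtime invoke call (assumption, not shown here).
    """
    for attempt in range(max_retries):
        try:
            return fn()
        except ThrottlingError:
            if attempt == max_retries - 1:
                raise  # out of retries; surface the error to the caller
            # Delay doubles each attempt, capped at 30 s, with random
            # jitter so concurrent clients don't retry in lockstep.
            delay = min(base_delay * (2 ** attempt), 30) * random.uniform(0.5, 1.0)
            time.sleep(delay)
```

For sustained load beyond the 25 invocations/second default, a queue in front of the agent (or a quota increase) is still the better fix; backoff only smooths out bursts.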
Sources
Data limits for bedrock runtime agents | AWS re:Post
Quotas for Amazon Bedrock AgentCore - Amazon Bedrock AgentCore
Bedrock Test Agent quota error | AWS re:Post