Hi Matt.
The issue you're facing with large inputs into Bedrock Prompt Flow is likely due to service quotas, such as maximum input size or processing time limits. Some Bedrock quotas aren't adjustable, which could cause the internal server errors you're seeing when working with large S3 files.
To work around this, you could pre-process or reduce the input size before feeding it into the prompt flow. If this isn't feasible, you might need to explore alternative services better suited for handling large data inputs.
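One way to pre-process is to split a large document into smaller chunks and run the flow once per chunk. Here's a minimal sketch of a generic chunking helper — the character budget and overlap values are assumptions you'd tune for your model (a rough heuristic is ~4 characters per token), and the actual Bedrock invocation (e.g. via the `bedrock-agent-runtime` client) would wrap around it:

```python
def chunk_text(text: str, max_chars: int = 16000, overlap: int = 200) -> list[str]:
    """Split text into overlapping chunks of at most max_chars characters.

    Assumes ~4 chars/token, so 16000 chars is roughly 4000 tokens --
    tune these numbers for your model and quota limits.
    """
    if max_chars <= overlap:
        raise ValueError("max_chars must exceed overlap")
    chunks = []
    start = 0
    while start < len(text):
        end = min(start + max_chars, len(text))
        chunks.append(text[start:end])
        if end == len(text):
            break
        # Step back by `overlap` so adjacent chunks share some context
        start = end - overlap
    return chunks
```

You would then read the S3 object, chunk it, and invoke the flow per chunk, aggregating the outputs afterward.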
For further details, refer to the AWS documentation on quotas and Prompt Flows.
Thank you for your reply. For reference, the input tokens tend to average from around 4,000 with no additional context added, to around 6-7k if the user adds context files. The output is generally 600-1,000 tokens. (It's another app I built that we wanted to try recreating with AWS.) Does this seem like it should cause those timeouts?