
Provide a payload greater than 256 KB for a Step Functions HTTP Task


I want to use the newish Step Functions HTTP Task. My problem is that it requires the HTTP body as task input, and in my use case this body can be greater than 256 KB, which exceeds the task's maximum payload limit. In my current approach I use the claim-check pattern: I store the large payload in S3, pass the reference, hydrate it in a Lambda task, and send the HTTP request manually through axios. Since I want to take advantage of the EventBridge connection, this workaround is not great, which is why I would love to use the Step Functions HTTP Task with larger payloads.

Is this possible somehow?
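For readers unfamiliar with the claim-check pattern mentioned above: the idea is to replace an oversized payload with a small S3 reference and resolve it again where the payload is actually needed. A minimal sketch in Python (the 256 KB threshold matches the Step Functions state payload limit; function and field names here are illustrative, not part of any AWS API):

```python
import json

MAX_STATE_PAYLOAD = 256 * 1024  # Step Functions per-state input/output limit

def to_claim_check(payload: dict, s3_client, bucket: str, key: str) -> dict:
    """Return the payload inline if it fits, otherwise park it in S3
    and return only a small reference (the "claim check")."""
    body = json.dumps(payload).encode("utf-8")
    if len(body) <= MAX_STATE_PAYLOAD:
        return {"inline": payload}
    s3_client.put_object(Bucket=bucket, Key=key, Body=body)
    return {"s3Ref": {"bucket": bucket, "key": key}}

def from_claim_check(ref: dict, s3_client) -> dict:
    """Hydrate a payload from its claim-check reference."""
    if "inline" in ref:
        return ref["inline"]
    obj = s3_client.get_object(
        Bucket=ref["s3Ref"]["bucket"], Key=ref["s3Ref"]["key"]
    )
    return json.loads(obj["Body"].read())
```

In the question's setup, `to_claim_check` would run in the producer before starting the execution, and `from_claim_check` inside the Lambda task just before the axios call.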

asked 2 years ago · 1.7K views
2 Answers
Accepted Answer

Today, the solution you're using is the best way to do what you need. S3 is not the only storage option you could use, but it is the one I'd recommend in any case.

If you are chatting with your local AWS Solutions Architect, mention to them that larger payloads for Step Functions would be handy - we are always looking for feedback about our services.

That said: what would happen if we raised the limit to (say) 512 KB? When you start to exceed that, you might want the limit raised again. At what point do we say "no" such that you need to rearchitect? There will always be a hard limit somewhere, and limits are generally set to protect you, the service, and other customers while offering an experience that is as performant and cost-effective as possible.

As above - you're doing what we recommend so continue to do that.
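For context, the recommended claim-check workflow can be expressed as a single Lambda task in Amazon States Language: the state input carries only the S3 reference, and the function hydrates the body and performs the HTTP call itself. A minimal sketch (the function name, endpoint field, and reference shape are placeholders, not a prescribed schema):

```json
{
  "StartAt": "HydrateAndSend",
  "States": {
    "HydrateAndSend": {
      "Type": "Task",
      "Resource": "arn:aws:states:::lambda:invoke",
      "Parameters": {
        "FunctionName": "hydrate-and-send-http",
        "Payload": {
          "s3Ref": {
            "bucket.$": "$.payloadBucket",
            "key.$": "$.payloadKey"
          },
          "endpoint.$": "$.endpoint"
        }
      },
      "End": true
    }
  }
}
```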

AWS EXPERT · answered 2 years ago
AWS EXPERT · reviewed 2 years ago
  • Struggling with a similar problem here. Also this seems to be along the same lines.

    The HTTP Task integration is a great feature in Step Functions, but it seems that in 2025, as API designers don't think much about bandwidth or storage, 256 KB may be too harsh a limit.

    As for where to draw the line, it's true that no matter how large the limit, there will always be someone who needs more.

    However, specifically for cases like mine where the HTTP payload is not processed inside Step Functions, I would suggest extending the HTTP Task integration so that the HTTP payload (the body of the request/response) is stored in specified S3 object(s), while the inter-state payload contains only the headers and metadata. It makes sense that there would be a limit on the amount of data processed this way as well, but it would be decoupled from the general state I/O limit and could therefore be set significantly higher.


The Step Functions manual recommends using Amazon S3 when payloads exceed 256 KB, but it doesn't provide any transport or seamless mechanism for doing so. Ideally, all integrations -- such as AWS service calls that can produce large outputs -- would automatically store results in S3 and return the S3 URL in the JSON response. For example, the CloudFormation GetTemplate API can return large payloads, so it would make sense for Step Functions to handle this seamlessly by saving the output to S3. Why is it that only Distributed Map supports reading from and writing to S3? And why doesn't the Step Functions workflow itself offer such wrappers for all inter-step payload handling?
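For what it's worth, the Distributed Map S3 support referred to above looks roughly like this in Amazon States Language: ItemReader streams items from an S3 object and ResultWriter writes results back to S3, so neither needs to pass through the 256 KB state payload. A sketch (bucket names, keys, and the inner Pass state are placeholders):

```json
{
  "Type": "Map",
  "ItemReader": {
    "Resource": "arn:aws:states:::s3:getObject",
    "ReaderConfig": { "InputType": "JSON" },
    "Parameters": { "Bucket": "my-input-bucket", "Key": "items.json" }
  },
  "ItemProcessor": {
    "ProcessorConfig": { "Mode": "DISTRIBUTED", "ExecutionType": "STANDARD" },
    "StartAt": "ProcessItem",
    "States": {
      "ProcessItem": { "Type": "Pass", "End": true }
    }
  },
  "ResultWriter": {
    "Resource": "arn:aws:states:::s3:putObject",
    "Parameters": { "Bucket": "my-output-bucket", "Prefix": "results/" }
  },
  "End": true
}
```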

answered 3 months ago
