To invoke a Step Function when an object is uploaded to S3, you'll need to create a Lambda function that starts an execution of the state machine. Each uploaded object triggers its own separate, independent execution.
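A minimal sketch of such a Lambda handler, assuming the state machine ARN is supplied via a `STATE_MACHINE_ARN` environment variable and the execution input shape (`bucket`/`key`) is just an illustration:

```python
import json
import os


def execution_inputs(event):
    """Build one Step Functions input dict per object in an S3 notification event."""
    inputs = []
    for record in event.get("Records", []):
        s3 = record["s3"]
        inputs.append({
            "bucket": s3["bucket"]["name"],
            "key": s3["object"]["key"],
        })
    return inputs


def handler(event, context):
    """Lambda entry point: start one independent execution per uploaded object."""
    import boto3  # bundled in the AWS Lambda Python runtime
    sfn = boto3.client("stepfunctions")
    for item in execution_inputs(event):
        sfn.start_execution(
            stateMachineArn=os.environ["STATE_MACHINE_ARN"],
            input=json.dumps(item),
        )
```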
There isn't a concept of "enabled" and "disabled" for state machine executions - they are either running and in some state, or terminated.
You can find an experimental CDK solution construct that creates an S3 bucket connected to a Step Functions state machine here.
There is no direct integration between S3 and Step Functions. You can send the S3 notification to a Lambda function, as suggested by @Michael_F, or use a no-code solution by sending the event to EventBridge and setting up a rule with a Step Functions target.
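For the EventBridge route, the rule's event pattern would look roughly like this (the bucket name is a placeholder, and EventBridge notifications must be enabled on the bucket for these events to flow):

```python
import json

# Sketch of an EventBridge event pattern matching S3 object uploads.
# "my-upload-bucket" is a hypothetical bucket name.
event_pattern = {
    "source": ["aws.s3"],
    "detail-type": ["Object Created"],
    "detail": {"bucket": {"name": ["my-upload-bucket"]}},
}

# The pattern is passed as a JSON string when the rule is created, e.g.:
# events.put_rule(Name="s3-to-sfn", EventPattern=json.dumps(event_pattern))
rule_pattern_json = json.dumps(event_pattern)
```

The rule's target would then be the state machine ARN, with an IAM role allowing EventBridge to call `states:StartExecution`.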
That said, I am not sure that Step Functions is the right solution for you. It seems that you want to run some process on multiple files. If that is the case, you will not be able to "attach" files to a running state machine. There may be some options using the Wait for Task Token integration pattern, but I am not sure it will solve what you are trying to do.
I think it would be best if you could explain your use case better.
@Michael_F - thanks. Follow-up question: if we need a Lambda in between, it should be possible to check whether the state machine is already running or in some state, right? Instead of creating another execution, we would only start a new one if the previous one has terminated.
There are a few ways to go about that. The naive way would be to probe all the running executions, but the ListExecutions API method is eventually consistent, so a very recently started execution might not appear yet, which could lead to duplicate starts. Instead, I would recommend keeping a list of object processing states in DynamoDB (using the S3 object key as the primary key).
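A sketch of that deduplication guard, using a DynamoDB conditional write so only one invocation can "claim" an object (the table name `object-processing-state` and the attribute names are assumptions for illustration):

```python
def claim_request(table_name, bucket, key):
    """Build a PutItem request that claims an object for processing.

    The condition expression makes the write fail if the object was
    already claimed, which is what makes the check race-free.
    """
    return {
        "TableName": table_name,
        "Item": {
            "ObjectId": {"S": f"{bucket}/{key}"},  # hypothetical key schema
            "Status": {"S": "PROCESSING"},
        },
        "ConditionExpression": "attribute_not_exists(ObjectId)",
    }


def try_claim(bucket, key):
    """Return True if this invocation claimed the object, False if another already did."""
    import boto3
    from botocore.exceptions import ClientError
    ddb = boto3.client("dynamodb")
    try:
        ddb.put_item(**claim_request("object-processing-state", bucket, key))
        return True
    except ClientError as e:
        if e.response["Error"]["Code"] == "ConditionalCheckFailedException":
            return False
        raise
```

The Lambda would call `try_claim` first and only start (or signal) the state machine when it returns True; a terminal state in the machine can then update or delete the item.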