Just use an SQS FIFO queue and assign the same MessageGroupId to all messages. Process the queue with a Lambda function. Because all messages share one group ID, SQS will never deliver to more than one consumer at a time.
One option would be to put a Lambda function between the queue and the Step Functions state machine: it dequeues each message and passes it to Step Functions for further processing. That way you can control the concurrency in the Lambda.
Another option would be to implement the check inside the state machine itself: call ListExecutions for the state machine with a filter for RUNNING. If there are two running instances (the current one plus another), put the message back into the queue and stop; otherwise, continue. You can refer to the Check Capacity step of the Serverlesspresso Workshop at https://catalog.us-east-1.prod.workshops.aws/workshops/28e7066a-b0bb-42ad-a0e9-8e8eeeb51133/en-US/1-workflow/4-capacity for more details.
But the SQS event source can spawn many concurrent Lambda invocations, one per message.
Also, we cannot run the actual process in a Lambda, as it's a long-running "build" process which happens inside CodeBuild.
So at best, we can trigger a Step Function, or CodeBuild directly, as a result of an incoming message. But if the next message is ingested and processed while the build is still running, it has to be put back in the queue, which leads to wasteful reads and Lambda executions.
If your task runs in Lambda, what I suggested works. However, if your Lambda invokes a different process, you will need some other mechanism, such as Step Functions with concurrency controls that you implement yourself, using DynamoDB for example. Note that you can invoke a state machine without using Lambda by using EventBridge Pipes.