You did not indicate what triggers Lambda2.
If Lambda2 is also triggered by the same S3 object, you can invoke Lambda1 as a Lambda Destination from Lambda2, or trigger it from the DynamoDB stream.
If the two triggers are unrelated, you will need to write some code in Lambda1. Trigger the function from both the S3 event and the DDB change (either as a Lambda Destination from Lambda2 or via the DDB stream). When the function is invoked, it checks which event triggered it. If it was invoked by S3, it checks DDB for the expected item; if it was invoked by DDB, it checks S3 for the expected object. If the check succeeds, the function continues; if the item is not found, the function exits.
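A minimal sketch of that dispatch logic in Python. The helper functions `ddb_item_ready` and `s3_object_ready` are hypothetical placeholders; in a real function you would replace them with boto3 calls against your own bucket and table:

```python
def invocation_source(event):
    """Return 's3' or 'dynamodb' based on the event shape, else None."""
    records = event.get("Records", [])
    if records:
        src = records[0].get("eventSource")
        if src == "aws:s3":
            return "s3"
        if src == "aws:dynamodb":
            return "dynamodb"
    return None

def ddb_item_ready(event):
    # Placeholder: query DynamoDB (e.g. boto3 get_item) for the expected item.
    raise NotImplementedError

def s3_object_ready(event):
    # Placeholder: HEAD the expected S3 object (e.g. boto3 head_object).
    raise NotImplementedError

def lambda_handler(event, context):
    source = invocation_source(event)
    if source == "s3":
        # Invoked by the S3 event: check DDB for the right value.
        if not ddb_item_ready(event):
            return {"status": "waiting for DynamoDB item"}
    elif source == "dynamodb":
        # Invoked by the DDB stream: check S3 for the right object.
        if not s3_object_ready(event):
            return {"status": "waiting for S3 object"}
    else:
        return {"status": "unknown trigger, exiting"}
    # Both inputs are present: continue with the real work here.
    return {"status": "processing"}
```

Note that if you invoke Lambda1 as a Lambda Destination from Lambda2 instead of via the DDB stream, the event shape differs, so the source check would need an extra branch for that payload.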
The approach you described will result in a more complex solution, many messages going to the DLQ, and higher cost.
Hi Mihai,
The right way to orchestrate such workflows is with Step Functions: https://aws.amazon.com/step-functions/
With Step Functions, you can build sophisticated combinations of events, Lambda functions, human interactions, etc., with loops, conditions, parallel branches, and more.
The graphical interface for managing the execution of a given workflow makes it very easy to monitor how tasks progress.
Best,
Didier
Hello,
Thanks a lot for the clean solution, but I am already in a later phase of the project, where the design is as described.
What is the best approach in such a situation?
Thank you, Mihai
Hello,
Thank you for the reply and the solution !
As shown in the architecture above, Lambda2 is not triggered directly by the same S3 object (the initial event is the same), but by another S3 object generated by another Lambda function. So the solution with some additional code could be applied.
Thank you,
Mihai