Lambda with multiple event dependencies


Hello,

What is the standard serverless approach to handle a Lambda invocation that depends on multiple events?

Is the scenario below a good approach?

  • Lambda1 is triggered by an S3 event (with the S3 event parameter)
  • Lambda1 needs the results that Lambda2 inserts into DynamoDB, so Lambda1 has to run after a success event from Lambda2 (some sort of OnSuccess Destination of Lambda2)
  • so, Lambda1 should start after s3_event + lambda_onsuccess_event
  • is it a good approach to use the asynchronous invocation configuration for Lambda1, where Lambda1 throws an exception if it does not find the result from Lambda2 in DynamoDB? (Asynchronous invocation configuration = retry attempts + maximum age of event + Lambda dead-letter queue.) A rough sketch of what I mean is shown after this list.
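
For clarity, this is roughly what I have in mind (a minimal sketch only; the table name, key schema, queue ARN, and values are placeholders I made up):

```python
import boto3

dynamodb = boto3.resource("dynamodb")
table = dynamodb.Table("Lambda2Results")  # placeholder table name


def lambda1_handler(event, context):
    # Lambda1 is invoked asynchronously by the S3 event.
    s3_key = event["Records"][0]["s3"]["object"]["key"]

    response = table.get_item(Key={"pk": s3_key})  # placeholder key schema
    if "Item" not in response:
        # Raising makes this asynchronous invocation fail, so Lambda retries it
        # (up to the configured retry attempts / maximum event age) and finally
        # sends the event to the dead-letter queue.
        raise RuntimeError(f"Result from Lambda2 not yet in DynamoDB for {s3_key}")

    # ... continue processing with response["Item"] ...
    return {"status": "ok"}
```

and the asynchronous invocation settings for Lambda1 would be something like:

```python
import boto3

client = boto3.client("lambda")

# Retry attempts + maximum age of event (example values).
client.put_function_event_invoke_config(
    FunctionName="Lambda1",
    MaximumRetryAttempts=2,
    MaximumEventAgeInSeconds=3600,
)

# Dead-letter queue for events that still fail after the retries (placeholder ARN).
client.update_function_configuration(
    FunctionName="Lambda1",
    DeadLetterConfig={"TargetArn": "arn:aws:sqs:eu-west-1:123456789012:lambda1-dlq"},
)
```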

Sorry for the late clarification on the design below: the events share the same root event.

Thank you,
Mihai

asked 8 months ago · 297 views
2 Answers
Accepted Answer

You did not indicate what triggers Lambda2.

If the trigger for Lambda2 is also based on the same S3 object, you can invoke Lambda1 as a Lambda Destination from Lambda2, or trigger it from the DynamoDB stream.
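
For example, either option could be wired up roughly like this (a boto3 sketch; the function names, stream ARN, and account details are placeholders):

```python
import boto3

lambda_client = boto3.client("lambda")

# Option A: invoke Lambda1 as an OnSuccess destination of Lambda2
# (applies to asynchronous invocations of Lambda2).
lambda_client.put_function_event_invoke_config(
    FunctionName="Lambda2",
    DestinationConfig={
        "OnSuccess": {
            "Destination": "arn:aws:lambda:eu-west-1:123456789012:function:Lambda1"
        }
    },
)

# Option B: trigger Lambda1 from the DynamoDB stream of the results table.
lambda_client.create_event_source_mapping(
    EventSourceArn="arn:aws:dynamodb:eu-west-1:123456789012:table/Lambda2Results/stream/2024-01-01T00:00:00.000",
    FunctionName="Lambda1",
    StartingPosition="LATEST",
)
```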

If the two function triggers are unrelated, you will need to write some code in Lambda1. You will trigger the function from both the S3 event and the DDB change (either as a Lambda destination from Lambda2 or from the DDB stream). When the function is invoked, it checks why it was invoked: if it was invoked due to S3, it checks DDB for the right value; if it was invoked from DDB, it checks S3 for the right object. If the check succeeds, the function continues; if the item is not found, the function exits.
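
A rough sketch of that check in Lambda1 (the bucket, table, and key names are placeholders):

```python
import boto3
from botocore.exceptions import ClientError

s3 = boto3.client("s3")
dynamodb = boto3.resource("dynamodb")
table = dynamodb.Table("Lambda2Results")  # placeholder table name


def lambda1_handler(event, context):
    records = event.get("Records", [])
    source = records[0].get("eventSource") if records else None

    if source == "aws:s3":
        # Invoked by S3: check whether Lambda2's result is already in DynamoDB.
        key = records[0]["s3"]["object"]["key"]
        if "Item" not in table.get_item(Key={"pk": key}):
            return  # the other prerequisite is not ready yet; exit and wait
    elif source == "aws:dynamodb":
        # Invoked by the DynamoDB stream: check whether the S3 object exists.
        key = records[0]["dynamodb"]["Keys"]["pk"]["S"]
        try:
            s3.head_object(Bucket="my-input-bucket", Key=key)  # placeholder bucket
        except ClientError:
            return  # the S3 object is not there yet; exit and wait
    else:
        return  # e.g. a Lambda destination payload would need its own parsing

    process(key)  # both prerequisites are present; continue the real work


def process(key):
    ...  # the actual processing done by Lambda1
```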

Using the approach you described will result in a more complex solution, many messages going to the DLQ, and higher cost.

AWS
EXPERT
Uri
answered 8 months ago
  • Hello,

    Thank you for the reply and the solution!

    As presented in the architecture above, Lambda2 is not directly based on the same S3 object (the initial event is the same), but on another S3 object generated by another Lambda function. So the solution with some additional code could be applied.

    Thank you,
    Mihai


Hi Mihai,

The right way to orchestrate such workflows is Step Functions: https://aws.amazon.com/step-functions/

With Step Functions, you can build very sophisticated combinations of events, Lambda functions, human interactions, etc., with loops, conditions, parallel branches, and so on.
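
As a minimal illustration (the ARNs, role, and state machine name are placeholders), a state machine could run Lambda2 and then Lambda1, so that Lambda1 only starts once Lambda2's result has been written:

```python
import json
import boto3

sfn = boto3.client("stepfunctions")

# Sketch of a simple two-step workflow: Lambda2 first, then Lambda1.
definition = {
    "StartAt": "RunLambda2",
    "States": {
        "RunLambda2": {
            "Type": "Task",
            "Resource": "arn:aws:lambda:eu-west-1:123456789012:function:Lambda2",
            "Next": "RunLambda1",
        },
        "RunLambda1": {
            "Type": "Task",
            "Resource": "arn:aws:lambda:eu-west-1:123456789012:function:Lambda1",
            "End": True,
        },
    },
}

sfn.create_state_machine(
    name="S3ProcessingWorkflow",
    definition=json.dumps(definition),
    roleArn="arn:aws:iam::123456789012:role/StepFunctionsExecutionRole",
)
```

The S3 event would then start an execution of this workflow (for example via EventBridge), and parallel or choice states can be added as the workflow grows.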

The graphical interface for following the execution of a given workflow makes it very easy to monitor how the tasks progress.

Best,

Didier

AWS
EXPERT
answered 8 months ago
EXPERT
reviewed 8 months ago
  • Hello,

    Thanks a lot for the clean solution, but at this point I am in a later phase of a project that already uses the design described above.

    What is the best approach in such a situation?

    Thank you, Mihai
