Ensuring Sequential Message Processing in Node.js Application using SNS


I have a Node.js application that acts as a connector, facilitating message exchange between WhatsApp and another application (let's call it CCM). Here's how it works:

  • The connector receives a message from WhatsApp.
  • It forwards this message to CCM.
  • CCM publishes the message to an SNS topic.
  • Upon successfully publishing the message, CCM sends a 200 OK response to the connector, indicating successful asynchronous communication.
  • CCM also subscribes to the same SNS topic to receive the published message: one CCM endpoint publishes the message to SNS, and another endpoint receives and processes it.
  • Upon receiving the message, CCM initiates a chat using Amazon Connect Chat APIs.

(Attachments: Flow Diagram 1, Flow Diagram 2)

However, I'm encountering an issue: when multiple messages are sent in quick succession (2-3 messages together), CCM starts multiple chats for the same customer. This happens because CCM takes some time to process the first message, and the subsequent messages arrive before it finishes. Processing a message involves looking up the customer in the database using the information the message contains, etc.

Questions:

  • Is this behavior normal for the pub-sub system?
  • Can anyone suggest a solution to this problem?
  • Shouldn't CCM wait for the acknowledgment of the first message before processing subsequent ones?
1 Answer

Yes, this is normal behavior for SNS. SNS delivers each message to its subscribers as soon as it receives it. If ordering matters to you, use SNS FIFO -> SQS FIFO and consume the messages from the queue one at a time.

If you have only a single subscriber (CCM), you may not need SNS in the architecture at all. Let the CCM connector send the message to an SQS FIFO queue, using the chat ID as the Message Group ID, and have the CCM endpoint consume the messages from the queue. If you can't modify the CCM endpoint to poll for messages in the queue, you can use Amazon EventBridge Pipes to poll the messages from the queue and forward them to an API Destination.
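For illustration, here is a minimal sketch of sending a message to an SQS FIFO queue with the chat ID as the Message Group ID, using the AWS SDK for JavaScript v3 in TypeScript. The queue URL, function name, and message shape are assumptions, not part of your setup:

```typescript
import { SQSClient, SendMessageCommand } from "@aws-sdk/client-sqs";

const sqs = new SQSClient({ region: "us-east-1" });

// Hypothetical FIFO queue URL -- replace with your own.
const QUEUE_URL =
  "https://sqs.us-east-1.amazonaws.com/123456789012/ccm-inbound.fifo";

// Hypothetical helper: enqueue one WhatsApp message for a given chat.
export async function enqueueWhatsAppMessage(
  chatId: string,
  payload: Record<string, unknown>
): Promise<void> {
  await sqs.send(
    new SendMessageCommand({
      QueueUrl: QUEUE_URL,
      MessageBody: JSON.stringify(payload),
      // Messages with the same group ID are delivered strictly in order,
      // and only one is in flight per group at a time.
      MessageGroupId: chatId,
      // Required unless content-based deduplication is enabled on the queue;
      // ideally use a stable value such as the WhatsApp message ID.
      MessageDeduplicationId: `${chatId}-${Date.now()}`,
    })
  );
}
```

Because all messages for the same customer share a Message Group ID, SQS will not release the next message to the consumer until the previous one has been deleted, so a second chat cannot be started while the first is still being set up.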

Uri (AWS), EXPERT, answered 2 months ago
  • Thank you for your response and the suggestions provided. I've implemented the approach using SQS with EventBridge Pipes as you recommended. However, I've encountered an issue regarding the processing time exceeding the maximum client execution timeout of 5 seconds enforced by EventBridge for requests to API destination endpoints.

    Given that my message processing takes more than 5 seconds to complete (for the first message, with which I start a chat using the Connect APIs), the same messages are being resent due to the timeout.

    Considering this scenario, I'm exploring alternative solutions to ensure sequential message processing without being constrained by the 5-second timeout limitation. Here are some specific points I'd like to address and discuss further:

    Alternative Timeout Handling: Are there any strategies or configurations within EventBridge or other AWS services that can accommodate longer processing times without hitting the timeout limit? For instance, is it possible to extend the timeout or implement a mechanism to handle longer processing times gracefully?

    Architectural Adjustments: Given the constraints of the current implementation, are there any architectural adjustments or alternative AWS services that would better suit my use case?

    I'm eager to explore any additional options or suggestions you may have to address this issue effectively. Thank You

  • You can use Pipes -> Lambda -> CCM. In this case, the function can run for up to 15 minutes. It will increase your cost, but it will eliminate the 5-second timeout.

    As I said before, if you can change CCM, let it poll the queue instead of having messages pushed to it. If you can do that, you eliminate the need for the API destination/Lambda.
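
    As a rough sketch of the Pipes -> Lambda -> CCM option (TypeScript on Node.js 18+): with an SQS source and a Lambda target, the pipe invokes the function with the batch of SQS messages, and the handler below relays each one to a hypothetical CCM endpoint. The endpoint URL and payload handling are assumptions, not part of the original setup.

    ```typescript
    // EventBridge Pipes with an SQS source passes the batch of SQS messages
    // to the Lambda target as a JSON array.
    interface SqsMessage {
      messageId: string;
      body: string;
    }

    // Hypothetical CCM ingestion endpoint -- replace with your internal URL.
    const CCM_ENDPOINT = "https://ccm.example.com/messages";

    export const handler = async (messages: SqsMessage[]): Promise<void> => {
      for (const message of messages) {
        // Node.js 18+ provides a global fetch.
        const response = await fetch(CCM_ENDPOINT, {
          method: "POST",
          headers: { "Content-Type": "application/json" },
          body: message.body,
        });
        if (!response.ok) {
          // Throwing fails the invocation, so the message stays in the queue
          // and is retried, preserving order within its message group.
          throw new Error(`CCM returned HTTP ${response.status}`);
        }
      }
    };
    ```

    Setting the pipe's batch size to 1 keeps processing to one message at a time per message group, and the Lambda timeout (configurable up to 15 minutes) replaces the 5-second API destination limit.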
