Loss of data from SQS FIFO queue


We have a FIFO queue added as a trigger to a Lambda function (consumer Lambda) for processing 7000 records in order. The issue is that some random messages are getting lost and never reach the consumer Lambda. There is no log trace for any error and no entries in the DLQ for the lost messages. If I reduce the input to 3000 records, the issue does not appear and every message is received by the consumer Lambda.

Lambda configuration:
Memory: 4096 MB
Ephemeral storage: 512 MB
Timeout: 0 min 30 sec

Queue configuration (defaults):
Maximum message size: 256 KB
Message retention period: 4 days
Default visibility timeout: 5 minutes
Messages available: 0
Delivery delay: 0 seconds
Messages in flight (not available to other consumers): 0
Receive message wait time: 10 seconds
Messages delayed: 0
Content-based deduplication: Disabled
High throughput FIFO: Disabled
Deduplication scope: Queue
FIFO throughput limit: Per queue

  • Could you share your Lambda function's reserved concurrency and the unreserved account concurrency limit? You can find this information in the Configuration > Concurrency tab of your consumer Lambda, or retrieve it programmatically as sketched below.
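If it is easier to check outside the console, here is a minimal sketch using the AWS SDK for JavaScript v3 (the function name "my-consumer-lambda" is a placeholder, not taken from the original post):

// Minimal sketch (assumption: AWS SDK for JavaScript v3 is installed;
// "my-consumer-lambda" is a placeholder for your consumer function name).
const {
  LambdaClient,
  GetFunctionConcurrencyCommand,
  GetAccountSettingsCommand,
} = require("@aws-sdk/client-lambda");

const client = new LambdaClient({});

async function showConcurrency() {
  // Reserved concurrency configured on the consumer function
  // (ReservedConcurrentExecutions is undefined if none is set)
  const fn = await client.send(
    new GetFunctionConcurrencyCommand({ FunctionName: "my-consumer-lambda" })
  );
  console.log("Reserved concurrency:", fn.ReservedConcurrentExecutions);

  // Account-level limits, including UnreservedConcurrentExecutions
  const account = await client.send(new GetAccountSettingsCommand({}));
  console.log("Account limits:", account.AccountLimit);
}

showConcurrency().catch(console.error);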

Saumya
Asked 4 months ago · 221 views
1 Answer

Hi,

The AWS Lambda function can be invoked with multiple messages from the Amazon SQS queue (unless the Batch Size is set to 1 in the trigger configuration).

The Lambda function should loop through each record that is passed in the event parameter, for example:

exports.handler = async function(event, context) {
  // Process every SQS record delivered in this invocation
  event.Records.forEach(record => {
    const { body } = record;
    console.log(body);
  });
  return {};
};

So, is the Batch Size greater than 1 in your case? See https://docs.aws.amazon.com/lambda/latest/dg/with-sqs.html for details about batch size.
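If you want to confirm the configured batch size programmatically rather than in the console, the following is a minimal sketch using the AWS SDK for JavaScript v3 (the function name is a placeholder, not from the original thread):

// Minimal sketch (assumption: AWS SDK for JavaScript v3 is installed;
// "my-consumer-lambda" is a placeholder for the consumer function name).
const {
  LambdaClient,
  ListEventSourceMappingsCommand,
} = require("@aws-sdk/client-lambda");

const client = new LambdaClient({});

async function showBatchSize() {
  // Each mapping corresponds to one trigger (here, the SQS FIFO queue)
  const { EventSourceMappings } = await client.send(
    new ListEventSourceMappingsCommand({ FunctionName: "my-consumer-lambda" })
  );
  for (const mapping of EventSourceMappings) {
    console.log(mapping.EventSourceArn, "BatchSize:", mapping.BatchSize);
  }
}

showBatchSize().catch(console.error);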

Best,

Didier

AWS Expert
Answered 4 months ago
  • Hi, Batch size is set to 1, hence we don't need a loop in our consumer Lambda.
