Are there limitations with Lambda triggers on DynamoDB when using a batch size of 10?


I have an AWS Lambda function connected to a DynamoDB trigger. I've noticed that not all records inserted into my DynamoDB table are reaching my Lambda function. I have configured the batch size to a maximum of 10 records. Are there any known limitations or issues with Lambda triggers on DynamoDB that could cause this behavior?

japacx
asked 4 months ago · 535 views
1 Answer

Hi,

This page explains why records inserted into your DynamoDB table might not reach your Lambda function: https://docs.aws.amazon.com/lambda/latest/dg/with-ddb.html#services-dynamodb-errors

It says:

**Before invocation**: If a Lambda event source mapping is unable to invoke the function 
due to throttling or other issues, it retries until the records expire or exceed the maximum 
age configured on the event source mapping (MaximumRecordAgeInSeconds).

**During invocation**: If the function is invoked but returns an error, Lambda retries 
until the records expire, exceed the maximum age (MaximumRecordAgeInSeconds), 
or reach the configured retry quota (MaximumRetryAttempts). For function errors, 
you can also configure BisectBatchOnFunctionError, which splits a failed batch into 
two smaller batches, isolating bad records and avoiding timeouts. Splitting batches 
doesn't consume the retry quota.
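In addition to bisecting batches, Lambda event source mappings for streams support partial batch responses: if the mapping is configured with `ReportBatchItemFailures` in its function response types, the handler can report only the records that failed, so Lambda retries from the first failed record instead of reprocessing (or eventually dropping) the whole batch. Here is a minimal sketch of such a handler; `process_record` is a hypothetical stand-in for your own per-record logic:

```python
def handler(event, context):
    """DynamoDB Streams handler that reports partial batch failures.

    Assumes the event source mapping is configured with
    FunctionResponseTypes=["ReportBatchItemFailures"].
    """
    failures = []
    for record in event["Records"]:
        try:
            process_record(record)
        except Exception:
            # Report the failing record's sequence number; Lambda then
            # retries from this record onward instead of the whole batch.
            failures.append(
                {"itemIdentifier": record["dynamodb"]["SequenceNumber"]}
            )
    return {"batchItemFailures": failures}


def process_record(record):
    # Placeholder business logic: reject records without a NewImage
    # (e.g. REMOVE events, if your function only handles inserts).
    if "NewImage" not in record["dynamodb"]:
        raise ValueError("missing NewImage")
```

Returning an empty `batchItemFailures` list tells Lambda the whole batch succeeded; returning the batch's first sequence number makes Lambda retry the entire batch.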

To improve resiliency in these situations, the next section on that page, "Configuring destinations for failed invocations", describes options for capturing and handling the failed invocations instead of losing the records.
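As a sketch of how those settings and a failure destination could be applied with boto3's `update_event_source_mapping`, the helper below builds the request parameters; the UUID, queue ARN, and the specific retry/age values are placeholders you would replace with your own:

```python
def on_failure_config(mapping_uuid: str, queue_arn: str) -> dict:
    """Build kwargs for boto3's lambda update_event_source_mapping call.

    mapping_uuid and queue_arn are hypothetical placeholders for your
    event source mapping UUID and the SQS queue ARN that will receive
    metadata about batches that could not be processed.
    """
    return {
        "UUID": mapping_uuid,
        "BisectBatchOnFunctionError": True,    # split failing batches in two
        "MaximumRetryAttempts": 2,             # stop retrying after 2 attempts
        "MaximumRecordAgeInSeconds": 3600,     # discard records older than 1h
        "DestinationConfig": {
            "OnFailure": {"Destination": queue_arn},
        },
    }


# Usage (requires AWS credentials; not executed here):
# import boto3
# boto3.client("lambda").update_event_source_mapping(
#     **on_failure_config("your-mapping-uuid", "arn:aws:sqs:...:your-queue"))
```

Note that the on-failure destination receives an invocation record describing the failed batch (shard, sequence number range), not the DynamoDB items themselves, so you still need to re-read the affected records from the stream or the table.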

Best,

Didier

AWS
EXPERT
answered 4 months ago
