DynamoDB Stream event source for Java & Python Lambda behaves differently with partial failures


Hi everyone,

I created 2 Lambdas (1 in Python & 1 in Java) from the code at https://docs.aws.amazon.com/lambda/latest/dg/with-ddb.html#services-ddb-batchfailurereporting. The only difference is that I return a partial failure for every record. Both Lambdas are configured to be triggered by a DynamoDB Stream with partial-failure reporting enabled & 2 retry attempts (same configuration for both). However, while the Python code processes messages three times (the first attempt & 2 retries), the Java Lambda runs only once (without any retry).

Is this a bug in AWS code or did I miss something?

Asked 6 months ago · 314 views
1 answer

Without seeing your code it's impossible to tell. But my assumption is that you are using ReportBatchItemFailures without actually returning the failures, in which case no retries happen because Lambda thinks everything succeeded. For example:

try:
    # some Lambda logic
except Exception as e:
    # report the failed record's sequence number back to Lambda
    return { "batchItemFailures": [ {"itemIdentifier": record["dynamodb"]["SequenceNumber"]} ] }
AWS
EXPERT
Answered 6 months ago
  • Hi, thanks for being interested in my question. Here is my code. Python:

    import json
    
    def handler(event, context):
        records = event.get("Records")
        curRecordSequenceNumber = "";
        
        for record in records:
            curRecordSequenceNumber = record["dynamodb"]["SequenceNumber"]
            # Deliberately report the first record as a partial failure on every invocation
            print({"batchItemFailures":[{"itemIdentifier": curRecordSequenceNumber}]})
            return {"batchItemFailures":[{"itemIdentifier": curRecordSequenceNumber}]}
    
        return {"batchItemFailures":[]}
    

    Java:

    ... // Some import
    
    public class TestStreamHandler implements RequestHandler<DynamodbEvent, Serializable> {
        LambdaLogger logger;
        final static ObjectMapper objectMapper = new ObjectMapper();
    
        @Override
        public StreamsEventResponse handleRequest(DynamodbEvent input, Context context) {
            logger = context.getLogger();
            List<StreamsEventResponse.BatchItemFailure> batchItemFailures = new ArrayList<>();
            String curRecordSequenceNumber = "";
    
            for (DynamodbEvent.DynamodbStreamRecord dynamodbStreamRecord : input.getRecords()) {
                StreamRecord dynamodbRecord = dynamodbStreamRecord.getDynamodb();
                curRecordSequenceNumber = dynamodbRecord.getSequenceNumber();
    
                // Deliberately report the first record as a partial failure on every invocation
                batchItemFailures.add(new StreamsEventResponse.BatchItemFailure(curRecordSequenceNumber));
                return new StreamsEventResponse(batchItemFailures);
            }
            return new StreamsEventResponse();
        }
    }
    
  • Here is my event source mapping configuration for the Python function:

    DynamoDB: TestStreamTable
    arn:aws:dynamodb:ap-southeast-1:<myaccount>:table/TestStreamTable/stream/2023-11-18T15:42:39.936
    state: Enabled
    Details
    Activate trigger: Yes
    Batch size: 100
    Batch window: None
    Concurrent batches per shard: 1
    Last processing result: OK
    Maximum age of record: -1
    On-failure destination: arn:aws:sqs:ap-southeast-1:<myaccount>:TestStreamFuncPythonSQS
    Report batch item failures: Yes
    Retry attempts: 2
    Split batch on error: Yes
    Starting position: TRIM_HORIZON
    Tumbling window duration: None
    UUID: <id>
    

    And for Java:

    DynamoDB: TestStreamTable
    arn:aws:dynamodb:ap-southeast-1:<myaccount>:table/TestStreamTable/stream/2023-11-18T15:42:39.936
    state: Enabled
    Details
    Activate trigger: Yes
    Batch size: 100
    Batch window: None
    Concurrent batches per shard: 1
    Last processing result: OK
    Maximum age of record: -1
    On-failure destination: arn:aws:sqs:ap-southeast-1:<myaccount>:TestStreamFuncJavaSQS
    Report batch item failures: Yes
    Retry attempts: 2
    Split batch on error: Yes
    Starting position: TRIM_HORIZON
    Tumbling window duration: None
    UUID: <id>
    

    Could you help take a look?
