You'll need some sort of external (i.e., outside of Lambda) data store to keep track of what is currently being processed.
If it were me, I'd create a DynamoDB table and store a record in it for each document that has started processing. At the beginning of processing, I'd check whether a record already exists for the document about to be processed, and also check that record's timestamp. At the end of processing, delete the record.
There is a possibility here of a race condition where one file is uploaded and has started processing and then another file is uploaded while the first is still in progress. I'm not sure how you would want to handle that.
Finally, in case processing fails, I would put a TTL on the DynamoDB record so that it is automatically deleted and unblocks future processing.
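To make this concrete, here's a minimal sketch of that lock pattern. The table name, key name (`document_id`), and TTL attribute name (`expires_at`) are all illustrative assumptions, not AWS defaults; in practice `table` would be a boto3 `Table` resource, and the "check for an existing record" step is done atomically with a conditional write rather than a separate read, which also avoids the race condition mentioned above.

```python
import time

LOCK_TTL_SECONDS = 15 * 60  # assumption: auto-expire a stale lock after 15 minutes


def lock_expiry(now=None, ttl=LOCK_TTL_SECONDS):
    """Epoch timestamp at which DynamoDB TTL may delete a stale lock record."""
    base = int(now if now is not None else time.time())
    return base + ttl


def try_acquire_lock(table, document_id):
    """Atomically create the lock record; return False if one already exists.

    `table` is expected to behave like a boto3 DynamoDB Table resource for a
    hypothetical table keyed on "document_id" with TTL on "expires_at".
    """
    try:
        table.put_item(
            Item={
                "document_id": document_id,
                "started_at": int(time.time()),
                "expires_at": lock_expiry(),  # the table's TTL attribute
            },
            # Fails if a record for this document already exists,
            # so check-and-create is a single atomic operation.
            ConditionExpression="attribute_not_exists(document_id)",
        )
        return True
    except Exception as exc:  # botocore ClientError in practice
        code = getattr(exc, "response", {}).get("Error", {}).get("Code", "")
        if code == "ConditionalCheckFailedException":
            return False  # another invocation holds the lock
        raise


def release_lock(table, document_id):
    """Delete the lock record once processing finishes."""
    table.delete_item(Key={"document_id": document_id})
```

In the Lambda handler you'd call `try_acquire_lock` first and simply return (or re-queue the event) when it returns `False`, then call `release_lock` in a `finally` block so the TTL only has to cover crash cases.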
I suspect there are some other requirements that will come out of this - I'd encourage you to reach out to your local AWS Solutions Architect to discuss what the best way forward is.