1 Answer
What kind of issues are you experiencing?
Make sure your Lambda function has permissions for both S3 and DynamoDB.
Remember, this has to be a synchronous operation; otherwise it will fail.
If too many changes are performed, you can decouple the feature with an SQS queue and process the updates with a small delay.
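The SQS decoupling suggested above could look roughly like this: the S3-triggered Lambda forwards a compact message to a queue instead of writing to DynamoDB directly, and a second consumer applies the usage update at its own pace. This is only a sketch; the queue URL and payload fields are assumptions, and the actual `send_message` call is left as a comment.

```python
# Sketch of decoupling the usage update through SQS (hypothetical payload
# shape; the real producer would call SQS via boto3).
import json

def to_queue_message(event):
    """Flatten an S3 put-event record into the payload a consumer would process."""
    obj = event["Records"][0]["s3"]["object"]
    return json.dumps({"key": obj["key"], "size": obj["size"]})

def enqueue(event):
    body = to_queue_message(event)
    # A real producer (with sqs:SendMessage permission) would then call:
    # boto3.client("sqs").send_message(QueueUrl=QUEUE_URL, MessageBody=body)
    return body
```

The consumer that drains the queue can then batch DynamoDB updates, which smooths out bursts of uploads.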
I have implemented user-level restrictions for SFTP, limiting each user to 1 GB of storage. My Lambda function monitors S3 events triggered by SFTP uploads and updates DynamoDB to track each user's storage usage. However, if a user uploads a 2 GB file in one go, the SFTP system won't restrict the upload immediately. The event is generated only after the file is stored in S3, which then updates DynamoDB. I need a solution that directly restricts users from uploading files larger than 1 GB to SFTP.
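Since the S3 event fires only after the object has been stored, the Lambda cannot block the transfer itself; one common reactive pattern is to check the uploaded object's size against the user's remaining quota and delete it if it would exceed the limit. A minimal sketch of that check, assuming keys are prefixed with the username and leaving the boto3 delete and DynamoDB update as comments (all names hypothetical):

```python
# Post-upload quota enforcement sketch: reject (delete) objects that would
# push a user past the 1 GB limit from the question.

QUOTA_BYTES = 1 * 1024**3  # 1 GB per-user limit

def extract_upload(event):
    """Pull the user prefix, key, and object size from an S3 put-event record."""
    obj = event["Records"][0]["s3"]["object"]
    key = obj["key"]
    user = key.split("/", 1)[0]  # assumes keys look like "<user>/<file>"
    return user, key, obj["size"]

def exceeds_quota(current_usage, object_size, limit=QUOTA_BYTES):
    """True when accepting this object would push the user past the limit."""
    return current_usage + object_size > limit

def handle(event, current_usage):
    """current_usage would come from the user's DynamoDB item in a real handler."""
    user, key, size = extract_upload(event)
    if exceeds_quota(current_usage, size):
        # Over quota: a real handler would remove the object here, e.g.
        # boto3.client("s3").delete_object(Bucket=BUCKET, Key=key)
        return {"user": user, "accepted": False, "key": key}
    # Within quota: persist the new counter, e.g. a DynamoDB update_item
    # with an ADD expression on the user's byte count.
    return {"user": user, "accepted": True, "usage": current_usage + size}
```

Note this is cleanup, not prevention: a 2 GB file still lands in S3 briefly before the Lambda removes it, so the user sees the upload succeed and then disappear.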