As mentioned in the answer above, AWS Transfer Family provides fully managed SFTP connectors to move files directly between S3 and remote SFTP servers. You can find more details in the documentation here - https://docs.aws.amazon.com/transfer/latest/userguide/creating-connectors.html
Please note that these connectors automatically emit a status event to EventBridge after every file transfer operation. You can filter on successful events to automatically move the file to your archive directory, and use failure events to send a warning message as needed.
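To illustrate the filtering, here is a sketch of the two EventBridge event patterns you would attach to rules (for example via `PutRule`, passing `json.dumps(pattern)` as the `EventPattern`). The detail-type strings are the ones documented for Transfer Family connector events, but verify them against the events your connector actually emits:

```python
import json

# Pattern for a rule that fires on successful connector sends
# (e.g. to trigger a Lambda that moves the object to an archive prefix).
success_pattern = {
    "source": ["aws.transfer"],
    "detail-type": ["SFTP Connector File Send Completed"],
}

# Pattern for a rule that fires on failed sends
# (e.g. to target an SNS topic that emails a warning).
failure_pattern = {
    "source": ["aws.transfer"],
    "detail-type": ["SFTP Connector File Send Failed"],
}

# EventBridge expects the pattern as a JSON string.
success_pattern_json = json.dumps(success_pattern)
```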
Your approach to using AWS Lambda for transferring files from an S3 bucket to a remote SFTP server is a good one, especially for serverless architectures. Here's a step-by-step guide to implement this solution:
- Set up an S3 bucket event trigger:
- Configure your S3 bucket to send an event notification when a new .CSV file is uploaded to a specific folder.
- This event will trigger your Lambda function.
- Create a Lambda function:
- Use a language supported by AWS Lambda (e.g., Python, Node.js) to write your function.
- The function should:
  a. Retrieve the file from S3
  b. Connect to the remote SFTP server
  c. Upload the file to the SFTP server
  d. Move the file to a 'sent' directory in S3 upon successful transfer
  e. Send an email notification if the transfer fails
- Set up necessary permissions:
- Create an IAM role for your Lambda function with permissions to:
  a. Read from and write to your S3 bucket
  b. Access AWS Secrets Manager (to store SFTP credentials securely)
  c. Send emails through Amazon SES (for failure notifications)
- Use AWS Transfer Family:
- While not strictly necessary, you might consider using AWS Transfer Family to simplify the SFTP connection process.
- Create an SFTP connector using AWS Transfer Family, which can handle the connection to your remote SFTP server.
- Implement error handling and retries:
- Add logic to retry failed transfers a certain number of times before sending a failure notification.
- Monitor and log:
- Use AWS CloudWatch to monitor your Lambda function's performance and log any issues.
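Putting the steps above together, here is a minimal sketch of such a Lambda function. It is an illustration under assumptions, not a drop-in implementation: the secret name `sftp/credentials`, the remote `/upload/` directory, the `sent/` prefix, and the notification addresses are all placeholders, and it uses the third-party `paramiko` library (packaged with the function or as a layer) for the SFTP session:

```python
import json
import os


def sent_key(key: str, prefix: str = "sent/") -> str:
    """Destination key for a transferred file, e.g. incoming/a.csv -> sent/a.csv."""
    return prefix + key.rsplit("/", 1)[-1]


def lambda_handler(event, context):
    # Imported inside the handler so the pure helper above can be read and
    # tested without boto3 or paramiko installed.
    import boto3
    import paramiko  # third-party: package with the function or in a layer

    s3 = boto3.client("s3")

    # a. Identify the uploaded object from the S3 event notification
    record = event["Records"][0]
    bucket = record["s3"]["bucket"]["name"]
    key = record["s3"]["object"]["key"]
    filename = key.rsplit("/", 1)[-1]

    # SFTP credentials from Secrets Manager ('sftp/credentials' is a placeholder)
    secret = boto3.client("secretsmanager").get_secret_value(SecretId="sftp/credentials")
    creds = json.loads(secret["SecretString"])

    try:
        # b./c. Connect to the remote SFTP server and stream the file across
        transport = paramiko.Transport((creds["host"], int(creds.get("port", 22))))
        transport.connect(username=creds["username"], password=creds["password"])
        sftp = paramiko.SFTPClient.from_transport(transport)
        with sftp.open(f"/upload/{filename}", "wb") as remote:
            s3.download_fileobj(bucket, key, remote)  # streams, avoids /tmp limits
        transport.close()

        # d. Move the object to the 'sent' prefix on success
        s3.copy_object(Bucket=bucket, Key=sent_key(key),
                       CopySource={"Bucket": bucket, "Key": key})
        s3.delete_object(Bucket=bucket, Key=key)
    except Exception as exc:
        # e. Notify on failure via SES (both addresses must be verified in SES)
        boto3.client("ses").send_email(
            Source=os.environ.get("ALERT_FROM", "alerts@example.com"),
            Destination={"ToAddresses": [os.environ.get("ALERT_TO", "ops@example.com")]},
            Message={"Subject": {"Data": f"SFTP transfer failed for {key}"},
                     "Body": {"Text": {"Data": str(exc)}}},
        )
        raise
```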
Potential pitfalls and considerations:
- File size limitations: Lambda has execution time and memory limits. For large files, you might need to implement multipart uploads or consider using AWS Step Functions to orchestrate the process.
- Network issues: Ensure your Lambda function has proper network access to reach the remote SFTP server. You might need to configure a VPC for your Lambda function.
- Credential management: Store SFTP credentials securely using AWS Secrets Manager and rotate them regularly.
- Concurrency: If you expect high volumes of file uploads, be aware of Lambda concurrency limits and adjust accordingly.
- Cost: While Lambda can be cost-effective, frequent invocations or long-running functions can increase costs. Monitor usage and optimize as needed.
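For the retry logic mentioned under error handling, a small self-contained helper like the following can wrap the SFTP upload call before falling through to the failure notification; the attempt count and delays are illustrative defaults:

```python
import time


def with_retries(operation, attempts=3, base_delay=1.0):
    """Run operation(); on failure, retry with exponential backoff.

    Re-raises the last exception after the final attempt so the caller
    (e.g. the Lambda handler) can fall through to its failure notification.
    """
    for attempt in range(attempts):
        try:
            return operation()
        except Exception:
            if attempt == attempts - 1:
                raise
            time.sleep(base_delay * (2 ** attempt))  # 1s, 2s, 4s, ...
```

Wrapping the upload as `with_retries(lambda: upload_to_sftp(...))` lets transient network errors recover automatically, while a persistent failure still propagates to the notification path. Keep the total retry time well under the function's configured timeout.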
Alternative approaches:
- AWS Transfer Family: For a fully managed SFTP solution, you could use AWS Transfer Family to create an SFTP server within AWS, which can directly integrate with S3. This eliminates the need for custom code but may not be suitable if you must use an existing external SFTP server.
- EC2 instance: For very large files or complex processing needs, you could use an EC2 instance running a scheduled task or service to handle the file transfers.
- AWS Step Functions: For more complex workflows or if you need to break the process into multiple steps, consider using AWS Step Functions to orchestrate the process.
The Lambda approach is generally the most straightforward and cost-effective for this scenario, especially if you're dealing with moderate file sizes and transfer frequencies. It provides a serverless, event-driven solution that can scale automatically with your needs.
Sources
- Community | Secure File Uploads to S3 Using AWS Lambda
- Setting up and using SFTP connectors - AWS Transfer Family
- Configure SFTP connectors - AWS Transfer Family
Many thanks for your help, this looks good.
I have created an SFTP connector and successfully tested it by listing a directory and uploading and downloading files using the CLI commands. Many thanks!
Now this needs to be incorporated into a Lambda function. I have created a Lambda function triggered by an event in the S3 bucket, which works well. I am now searching for example Lambda code that uses the fully managed SFTP connector to carry out the same tasks. Any pointers welcome! Many thanks
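While waiting for pointers, here is a minimal sketch of a Lambda handler that hands the uploaded object to a connector via the Transfer Family `StartFileTransfer` API. The connector ID is a placeholder, and note the call is asynchronous: success or failure arrives later as an EventBridge event, not in the response, so archive the file from a rule on that event rather than here:

```python
def s3_path(bucket: str, key: str) -> str:
    """Transfer Family connectors expect outbound paths as /<bucket>/<key>."""
    return f"/{bucket}/{key}"


def lambda_handler(event, context):
    # Imported here so the helper above can be tested without the AWS SDK.
    import boto3

    transfer = boto3.client("transfer")

    # Identify the uploaded object from the S3 event notification
    record = event["Records"][0]
    bucket = record["s3"]["bucket"]["name"]
    key = record["s3"]["object"]["key"]

    # Ask the connector to send the file to the remote SFTP server.
    response = transfer.start_file_transfer(
        ConnectorId="c-1234567890abcdef0",  # placeholder: your connector's ID
        SendFilePaths=[s3_path(bucket, key)],
    )
    return response["TransferId"]
```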
You may want to check out the examples in the modules of this workshop - https://catalog.us-east-1.prod.workshops.aws/workshops/e55c90e0-bbb0-47e1-be83-6bafa3a59a8a/en-US
It provides a reference architecture (and sample lambda functions) for invoking connector operations, though the orchestration is done via Step Functions in this case.
Many thanks. I have tried to do the workshop, but running the template following the instructions at https://catalog.us-east-1.prod.workshops.aws/workshops/e55c90e0-bbb0-47e1-be83-6bafa3a59a8a/en-US/10-environment-setup/30-non-aws-event gives the following error:
Template format error: Unrecognized resource types: [AWS::B2BI::Transformer]
Is it because I am running a Free Tier account for training?