From S3 bucket to SFTP server


Hi,

The requirement is to send a file from an S3 bucket to an SFTP server (which has already been configured in our AWS Transfer Family). In the documentation, I read that AWS Transfer Family is "a secure transfer service that enables you to transfer files into and out of AWS storage services."
But how do I do that? Is it possible to configure AWS Transfer Family to receive a file from an S3 bucket? Or must I configure the S3 bucket to send a file to an SFTP server (https://docs.aws.amazon.com/transfer/latest/userguide/transfer-file.html#post-processing-S3-object-metadata)?

Any help or a working example is highly appreciated. I lost several days trying to use the Paramiko library in a Lambda function, and even in a Glue job, but I never succeeded in installing that library. Before falling back to the approach in https://stackoverflow.com/questions/47905614/upload-s3-bucket-csv-file-to-sftp-server-in-nodejs, I thought I would ask the experts here who may know the right and efficient way to do it.

Thanks in advance,

Anthony

tonyAWS
asked 2 years ago · 7337 views
2 Answers

The SFTP server's storage is the S3 bucket. That is, "to send a file from the S3 bucket to the SFTP server" you need to do nothing but upload the object into the same bucket you configured for the SFTP user. If you want to automatically copy objects from another bucket into the one configured for SFTP, look into triggering a Lambda function when an object is uploaded (or modified) in your source bucket, as in the sketch below.
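
A minimal sketch of that trigger, assuming the bucket name and SFTP user prefix below are placeholders for your own setup and that the function's IAM role allows reading the source object and writing to the destination bucket:

```python
# Lambda handler sketch: copy each newly uploaded object from the source bucket
# into the bucket/prefix that backs the SFTP user's home directory.
# Bucket and user names are hypothetical -- replace with your own.
import urllib.parse

import boto3

s3 = boto3.client("s3")

DEST_BUCKET = "my-transfer-family-bucket"   # bucket configured for the SFTP server
SFTP_USER_PREFIX = "sftp-user-name"         # default home directory prefix


def lambda_handler(event, context):
    for record in event.get("Records", []):
        src_bucket = record["s3"]["bucket"]["name"]
        # Object keys in S3 event notifications are URL-encoded.
        src_key = urllib.parse.unquote_plus(record["s3"]["object"]["key"])
        s3.copy_object(
            Bucket=DEST_BUCKET,
            Key=f"{SFTP_USER_PREFIX}/{src_key}",
            CopySource={"Bucket": src_bucket, "Key": src_key},
        )
```

Configure the source bucket's "object created" event notification to invoke this function and the copy happens on every upload.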

Please note that your SFTP user's root directory is, by default, s3://bucket-name/user-name/.

For more complex directory mappings, logical directories can be helpful.
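
For example, a rough sketch of creating a user with a logical directory mapping via boto3 (the server ID, role ARN, bucket, and key below are placeholders):

```python
# Sketch: create an SFTP user whose root ("/") maps to a specific bucket prefix
# using Transfer Family logical directories. All identifiers are placeholders.
import boto3

transfer = boto3.client("transfer")

transfer.create_user(
    ServerId="s-1234567890abcdef0",                                   # your Transfer Family server ID
    UserName="partner-user",                                          # hypothetical SFTP user name
    Role="arn:aws:iam::111122223333:role/TransferS3AccessRole",       # role granting S3 access
    HomeDirectoryType="LOGICAL",
    HomeDirectoryMappings=[
        {"Entry": "/", "Target": "/my-transfer-family-bucket/partner-user"},
    ],
    SshPublicKeyBody="ssh-rsa AAAA... user@example",
)
```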

EXPERT
Kallu
answered 2 years ago

Hello,

Regarding copying S3 bucket data to an SFTP server, here are a few options:

  1. Use S3 event notifications. An S3 event notification is a mechanism to send a notification to various AWS destinations when a certain event happens in an S3 bucket. For example, you can invoke a Lambda function when a new object is uploaded to S3. With this mechanism, you can trigger a Lambda function upon the upload of a new object to the S3 bucket; this Lambda function then has to take care of authenticating to the on-premises SFTP server and copying the file to that server (a sketch follows after the permissions note below). The function can get its S3 permissions from the IAM role associated with it. You can refer to the document below to learn more about S3 event notifications.

https://docs.aws.amazon.com/AmazonS3/latest/userguide/NotificationHowTo.html

Note: in order to upload to and download from S3, you need the s3:PutObject and s3:GetObject permissions. You may need to add more permissions depending on the actions you perform against S3. For example, if you are doing a GetObject on a previous version of an object, you also need s3:GetObjectVersion. For the full list of S3 actions, you can refer to the document below:

https://docs.aws.amazon.com/AmazonS3/latest/userguide/list_amazons3.html#amazons3-actions-as-permissions
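A rough sketch of such a function, assuming Paramiko is bundled with the deployment package or a Lambda layer (that packaging step is what usually causes the install trouble you described), and with the SFTP host, credentials, and remote directory as placeholders:

```python
# Sketch: on an S3 "object created" event, stream the object to an external
# SFTP server with Paramiko. Paramiko must be packaged with the function
# (e.g. as a Lambda layer). Host, credentials, and paths are placeholders.
import io
import urllib.parse

import boto3
import paramiko

s3 = boto3.client("s3")

SFTP_HOST = "sftp.example.com"
SFTP_USER = "upload-user"
SFTP_PASSWORD = "change-me"        # better: fetch from Secrets Manager at runtime
REMOTE_DIR = "/incoming"


def lambda_handler(event, context):
    for record in event.get("Records", []):
        bucket = record["s3"]["bucket"]["name"]
        key = urllib.parse.unquote_plus(record["s3"]["object"]["key"])
        body = s3.get_object(Bucket=bucket, Key=key)["Body"].read()

        transport = paramiko.Transport((SFTP_HOST, 22))
        try:
            transport.connect(username=SFTP_USER, password=SFTP_PASSWORD)
            sftp = paramiko.SFTPClient.from_transport(transport)
            remote_path = f"{REMOTE_DIR}/{key.rsplit('/', 1)[-1]}"
            sftp.putfo(io.BytesIO(body), remote_path)   # upload from memory
            sftp.close()
        finally:
            transport.close()
```
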

  2. Use a shell script. You can set up automation on an EC2 instance (or an on-premises server) to copy files from the S3 bucket to the SFTP server using regular shell commands. The important thing to consider is how to authenticate to the SFTP server. For S3 authentication, you can use an IAM instance profile to obtain credentials.

Refer to the document below and see if it helps with authenticating to SFTP from a shell script. Please note that this is a third-party document that AWS doesn't own, so I cannot guarantee the information in it. I recommend going through it in detail before implementing this.

https://www.golinuxcloud.com/automate-sftp-shell-script-with-password-unix/

You could also consider running the script on the SFTP server itself. Typically, SFTP server files are stored on local disks and can be accessed directly from the OS. If you can automate a script there, you can use the AWS CLI s3 commands to copy files directly to the file system instead of authenticating to the SFTP server from another host; see the sketch below.
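
As a sketch of that last variant, run on the SFTP server itself (bucket, prefix, and local directory are placeholders; S3 credentials come from the instance profile or the usual credential chain):

```python
# Sketch: pull objects from S3 straight into the directory the SFTP daemon
# serves, so no SFTP authentication is needed. All names are placeholders.
import os

import boto3

s3 = boto3.client("s3")            # credentials from instance profile / environment

BUCKET = "my-source-bucket"
PREFIX = "outgoing/"
LOCAL_DIR = "/data/sftp/incoming"  # directory exposed by the SFTP server

paginator = s3.get_paginator("list_objects_v2")
for page in paginator.paginate(Bucket=BUCKET, Prefix=PREFIX):
    for obj in page.get("Contents", []):
        key = obj["Key"]
        if key.endswith("/"):
            continue               # skip "folder" placeholder keys
        local_path = os.path.join(LOCAL_DIR, os.path.basename(key))
        s3.download_file(BUCKET, key, local_path)
```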

  3. Use an AWS Transfer Family SFTP server, which exposes the objects in an S3 bucket over the SFTP protocol. You can authenticate with an SSH key pair using any supported SFTP client and access the files in the S3 bucket.

https://aws.amazon.com/aws-transfer-family/

I want to explain a little about the Transfer Family service. AWS Transfer Family provides fully managed support for file transfers directly into and out of Amazon S3 or Amazon EFS. When you create an SFTP-enabled Transfer Family server and create a user that maps to an S3 bucket, then once you connect to this Transfer server as that user from your machine, you can view all of the files already in your S3 bucket.

The Transfer Family server presents what is in the S3 bucket, so you can download/transfer files from the S3 bucket to your local machine, or upload/transfer files from your local machine to your S3 bucket (a small connection sketch follows the links below).

For more information on getting started with the Transfer Family service, such as creating servers and users, please refer here:

https://docs.aws.amazon.com/transfer/latest/userguide/getting-started.html

https://docs.aws.amazon.com/transfer/latest/userguide/how-aws-transfer-works.html
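
To illustrate the connection flow described above, here is a minimal scripted sketch against a Transfer Family SFTP endpoint (the endpoint, user name, key path, and file name are placeholders; any standard SFTP client works the same way):

```python
# Sketch: connect to the Transfer Family SFTP endpoint with an SSH key, then
# list and download what is in the user's home directory (i.e. the S3 bucket).
# Endpoint, user name, key path, and file name are placeholders.
import paramiko

HOST = "s-1234567890abcdef0.server.transfer.us-east-1.amazonaws.com"
USER = "sftp-user-name"
KEY_PATH = "/home/me/.ssh/transfer_key"

key = paramiko.RSAKey.from_private_key_file(KEY_PATH)
transport = paramiko.Transport((HOST, 22))
try:
    transport.connect(username=USER, pkey=key)
    sftp = paramiko.SFTPClient.from_transport(transport)
    for name in sftp.listdir("."):                 # objects under the user's home prefix
        print(name)
    sftp.get("report.csv", "/tmp/report.csv")      # hypothetical file name
    sftp.close()
finally:
    transport.close()
```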

SUPPORT ENGINEER
Yash_C
answered 2 years ago
