Questions tagged with AWS Transfer for SFTP

Browse through the questions and answers listed below.

How to connect to an AWS Transfer SFTP endpoint from an EC2 instance? Currently, from EC2, the connection is interrupted with a "Connection reset by peer" message:

    sftp -i <privatekey> -v user@<endpoint>.server.transfer.ap-southeast-2.amazonaws.com
    OpenSSH_7.4p1, OpenSSL 1.0.2k-fips 26 Jan 2017
    debug1: Reading configuration data /etc/ssh/ssh_config
    debug1: /etc/ssh/ssh_config line 58: Applying options for *
    debug1: Connecting to <endpoint>.server.transfer.ap-southeast-2.amazonaws.com [ServerPrivateIp] port 22.
    debug1: Connection established.
    debug1: permanently_set_uid: 0/0
    debug1: key_load_public: No such file or directory
    debug1: identity file sftp_id_rsa type -1
    debug1: key_load_public: No such file or directory
    debug1: identity file sftp_id_rsa-cert type -1
    debug1: Enabling compatibility mode for protocol 2.0
    debug1: Local version string SSH-2.0-OpenSSH_7.4
    debug1: Remote protocol version 2.0, remote software version AWS_SFTP_1.1
    debug1: no match: AWS_SFTP_1.1
    debug1: Authenticating to <endpoint>.server.transfer.ap-southeast-2.amazonaws.com:22 as 'user'
    debug1: SSH2_MSG_KEXINIT sent
    Connection closed by ServerPrivateIp port 22
    Couldn't read packet: Connection reset by peer
0 answers · 0 votes · 8 views
asked 4 days ago
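A connection that is established and then reset during key exchange usually points at reachability or endpoint configuration rather than the key itself (the log shows the client resolving a private IP, which suggests a VPC endpoint). A minimal troubleshooting sketch in bash; the server ID, security group ID and endpoint hostname below are placeholders, not values from the question:

```
# Placeholder server ID -- substitute your own.
SERVER_ID="s-1234567890abcdef0"

# 1. Check the endpoint type (PUBLIC vs VPC) and, for VPC endpoints,
#    which security groups and subnets are attached.
aws transfer describe-server --server-id "$SERVER_ID" \
  --query 'Server.{Type:EndpointType,Details:EndpointDetails}'

# 2. For a VPC endpoint, confirm the attached security group allows
#    inbound TCP 22 from the EC2 instance's CIDR or security group.
SG_ID="sg-0123456789abcdef0"   # placeholder, taken from the output above
aws ec2 describe-security-groups --group-ids "$SG_ID" \
  --query 'SecurityGroups[0].IpPermissions'

# 3. From the EC2 instance, verify basic TCP reachability on port 22
#    (bash's /dev/tcp avoids needing netcat installed).
ENDPOINT="s-1234567890abcdef0.server.transfer.ap-southeast-2.amazonaws.com"   # placeholder
timeout 5 bash -c "cat < /dev/null > /dev/tcp/${ENDPOINT}/22" \
  && echo "port 22 reachable" || echo "port 22 blocked"
```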
I changed permissions over SSH (very simple, just "chown -R ec2-user:apache /var/www/html/xxxxx/xxxx/xxx"), and now SSH rejects my connection with "ssh: connect to host xxx.xxx.xxx.xxx port 22: Connection refused". After that I checked and rebooted the EC2 instance, but it had no effect. What can I do?
1 answer · 0 votes · 21 views
asked 12 days ago
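When sshd itself refuses connections, an out-of-band way into the instance helps with diagnosis. A sketch assuming the instance is managed by AWS Systems Manager (it needs the SSM agent and an instance profile with the AmazonSSMManagedInstanceCore policy, which is an assumption here); the instance ID is a placeholder:

```
# Placeholder instance ID -- substitute your own.
INSTANCE_ID="i-0123456789abcdef0"

# Option 1: open a shell without SSH via Session Manager.
aws ssm start-session --target "$INSTANCE_ID"

# Option 2: inspect the most recent console output for sshd or boot errors.
aws ec2 get-console-output --instance-id "$INSTANCE_ID" --latest --output text | tail -n 50
```

A chown limited to a directory under /var/www/html would not normally stop sshd, so the console output is worth checking for an unrelated boot or disk problem.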
Hey all! I set up an SFTP server and used PuTTY Key Generator to create the SSH public key. As per the guidance on the AWS site for transferring a file, I downloaded Cyberduck and followed the instructions. I pasted the endpoint into the 'Server' field, used 22 for 'Port number', entered my username, and pasted my SSH private key. However, it also asks for a password. The only password I recall setting up was the one for the public key, but when I enter it and select 'Connect' in Cyberduck, I receive the message "Listing Directory Failed. Failed to create user (Unsupported or invalid SSH public key format)". Can anyone advise what I'm doing wrong, please? Thank you!
1 answer · 0 votes · 9 views
asked 25 days ago
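The "Unsupported or invalid SSH public key format" message usually means the key pasted into the Transfer user was not in the single-line OpenSSH format (PuTTYgen's "Save public key" writes the multi-line RFC4716 format instead). A minimal conversion sketch with placeholder filenames; the password Cyberduck prompts for is typically the passphrase protecting the private key, not a separate account password:

```
# Option 1: export an OpenSSH-format public key from the PuTTY private key
# (requires puttygen from the putty-tools package; filenames are placeholders).
puttygen mykey.ppk -O public-openssh -o mykey.pub

# Option 2: convert an RFC4716 ("SSH2") public key, such as the one saved
# by PuTTYgen's "Save public key" button, using ssh-keygen.
ssh-keygen -i -f mykey_rfc4716.pub > mykey.pub

# The result should be a single line starting with "ssh-rsa AAAA..." --
# that is what the Transfer user's SSH public key field expects.
cat mykey.pub
```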
Hi all! I've managed to set up an S3 bucket and an SFTP server. When adding users to the server, I am prompted to enter an SSH public key. I used PuTTY Key Generator on my laptop to generate the keys (both private and public). However, when I try to paste the SSH public key, I receive the message 'Failed to create user (Unsupported or invalid SSH public key format)'. Can anyone please advise what I could be doing wrong? TIA
2 answers · 0 votes · 20 views
asked a month ago
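This is the same "Unsupported or invalid SSH public key format" error as above, and one way to sidestep the PuTTY format question entirely is to generate the pair with ssh-keygen, which writes the public key in the single-line OpenSSH format the console expects. A sketch with placeholder filenames:

```
# Generate an RSA key pair directly in OpenSSH format (no conversion needed).
ssh-keygen -t rsa -b 4096 -f transfer_user_key -C "transfer-user"

# transfer_user_key     -> private key, used by the SFTP client
# transfer_user_key.pub -> single-line public key; paste its contents into
#                          the Transfer user's "SSH public key" field.
cat transfer_user_key.pub
```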
I'm attempting to set up permissions for a user account on the AWS Transfer service with the SFTP protocol. I have a use case where a user should be able to add a file to a directory but not list the files in it. When I tweak the IAM role to deny 's3:ListBucket' for a specific subdirectory, the put operation fails as well. In theory, S3 does allow PutObject without the ability to list prefixes; the AWS Transfer service, however, seems to implicitly use the list-bucket operation before a put. Has anyone managed to deny the listing ability while still being able to upload?

IAM policy:

```
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "AllowListDirectories",
            "Effect": "Allow",
            "Action": "s3:ListBucket",
            "Resource": ["arn:aws:s3:::<my-bucket>"],
            "Condition": {
                "StringLike": {
                    "s3:prefix": ["data/partner_2/*"]
                }
            }
        },
        {
            "Sid": "DenyMkdir",
            "Effect": "Deny",
            "Action": ["s3:PutObject"],
            "Resource": "arn:aws:s3:::<my-bucket>/*/"
        },
        {
            "Sid": "DenyListFilesInSubDirectory",
            "Effect": "Deny",
            "Action": ["s3:ListBucket"],
            "Resource": "arn:aws:s3:::<my-bucket>",
            "Condition": {
                "StringLike": {
                    "s3:prefix": ["data/partner_2/data/incoming/*"]
                }
            }
        },
        {
            "Sid": "AllowReadWriteInSubDirectory",
            "Effect": "Allow",
            "Action": [
                "s3:GetObject",
                "s3:GetObjectVersion",
                "s3:PutObject",
                "s3:PutObjectAcl",
                "s3:PutObjectTagging",
                "s3:PutObjectVersionAcl",
                "s3:PutObjectVersionTagging"
            ],
            "Resource": "arn:aws:s3:::<my-bucket>/data/partner_2/data/incoming/*"
        },
        {
            "Sid": "AllowOnlyReadInADifferentDirectory",
            "Effect": "Allow",
            "Action": [
                "s3:GetObject",
                "s3:GetObjectVersion"
            ],
            "Resource": "arn:aws:s3:::<my-bucket>/data/partner_2/data/outgoing/*"
        }
    ]
}
```

The output from the SFTP client:

```
sftp> cd data/incoming
sftp> ls
Couldn't read directory: Permission denied
sftp> put /Users/foo/Downloads/test.log
Uploading /Users/foo/Downloads/test.log to /data/incoming/test.log
remote open("/data/incoming/test.log"): Permission denied
sftp> get test-one.txt
Fetching /data/incoming/test-one.txt to test-one.txt
sftp> exit
```
1 answer · 0 votes · 41 views
asked a month ago
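One way to confirm whether the explicit Deny is what blocks the upload, rather than the PutObject statement itself, is to simulate the role's policies against the exact actions and resources involved. A sketch; the role ARN is a placeholder, while the bucket and prefixes mirror the question:

```
# Placeholder role ARN -- substitute the role attached to the Transfer user.
ROLE_ARN="arn:aws:iam::123456789012:role/transfer-partner2-role"
BUCKET="my-bucket"

# Does the policy allow putting the object itself?
aws iam simulate-principal-policy \
  --policy-source-arn "$ROLE_ARN" \
  --action-names s3:PutObject \
  --resource-arns "arn:aws:s3:::${BUCKET}/data/partner_2/data/incoming/test.log" \
  --query 'EvaluationResults[].{Action:EvalActionName,Decision:EvalDecision}'

# Is ListBucket explicitly denied for the incoming/ prefix? An explicit Deny
# always wins over the broader Allow on data/partner_2/*.
aws iam simulate-principal-policy \
  --policy-source-arn "$ROLE_ARN" \
  --action-names s3:ListBucket \
  --resource-arns "arn:aws:s3:::${BUCKET}" \
  --context-entries "ContextKeyName=s3:prefix,ContextKeyValues=data/partner_2/data/incoming/test.log,ContextKeyType=string" \
  --query 'EvaluationResults[].{Action:EvalActionName,Decision:EvalDecision}'
```

If s3:PutObject on the target key evaluates to allowed, the denied put is most likely caused by the extra ListBucket call made while resolving the path before the upload, which the explicit Deny blocks.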
Hi! Can anyone help direct me to someone I could speak to about setting up SFTP? I think this is what I need, but I'm not 100% sure and would appreciate some technical advice. In short, my team (part of Amazon) wants to regularly share data with a third party to be ingested into a dashboard. The third party suggests we use SFTP. Thanks in advance! Lyndsey
1 answer · 0 votes · 38 views
asked 2 months ago
Hello all. May I ask about a problem I found, shown in the picture below. ![Enter image description here](/media/postImages/original/IMmcesqKhAQPiLJUQCYLvS3g) **In order to perform the requested action, WordPress needs to access your web server. Please enter your FTP credentials to continue. If you don't remember your credentials, we recommend contacting your web host.** Can you please help me fix this? Thank you.
2 answers · 0 votes · 52 views
Taufiq
asked 3 months ago
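WordPress falls back to asking for FTP credentials when the PHP process cannot write to the WordPress directory directly. A common remedy on a self-managed EC2 instance is to give the web server user ownership of the WordPress tree and tell WordPress to use direct filesystem writes; the path and the apache user below are assumptions that vary by AMI (Bitnami images, for example, use the daemon user and a different document root):

```
# Assumed path and user for a typical Amazon Linux + Apache install;
# adjust both for your AMI.
WP_DIR="/var/www/html"
WEB_USER="apache"

# Let the web server write to the WordPress tree.
sudo chown -R "$WEB_USER":"$WEB_USER" "$WP_DIR"

# Tell WordPress to write directly instead of asking for FTP credentials.
grep -q "FS_METHOD" "$WP_DIR/wp-config.php" || \
  sudo sed -i "/stop editing/i define('FS_METHOD', 'direct');" "$WP_DIR/wp-config.php"
```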
I need to delete a few files on the server; the only way I know to get to them is via FTP. When I try to delete those files, I get a permission denied message. I am logged in with the bitnami username. Is it possible to fix this?
1 answer · 0 votes · 24 views
asked 3 months ago
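If the files are owned by another system user (on Bitnami images that is often the web server's daemon user), the bitnami account cannot delete them over FTP/SFTP, but it can over SSH with sudo. A sketch; the path is a placeholder:

```
# Placeholder path -- substitute the files you need to remove.
TARGET="/opt/bitnami/wordpress/wp-content/uploads/old-file.zip"

# Check the current owner; "permission denied" over FTP usually means the
# file is not owned by the bitnami user.
ls -l "$TARGET"

# Either delete it with elevated privileges over SSH...
sudo rm "$TARGET"

# ...or hand ownership to bitnami first so the FTP/SFTP client can manage it.
sudo chown bitnami:bitnami "$TARGET"
```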
I have an SFTP server. I can log in and see directories, and partner A can as well, but partner B cannot. I'm using the same roles and policies, but partner B gets access denied after connecting.
1 answer · 0 votes · 29 views
asked 3 months ago
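When two users share the same role and policies but only one is denied, comparing their Transfer user definitions side by side usually surfaces the difference (home directory, home directory mappings, or an attached session policy). A sketch; the server ID and user names are placeholders:

```
SERVER_ID="s-1234567890abcdef0"   # placeholder

# Dump both user definitions and compare role, home directory, mappings and policy.
for u in partnerA partnerB; do
  aws transfer describe-user \
    --server-id "$SERVER_ID" \
    --user-name "$u" \
    --query 'User.{Role:Role,HomeDir:HomeDirectory,Type:HomeDirectoryType,Mappings:HomeDirectoryMappings,Policy:Policy}'
done
```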
Hi, I am trying to set up an AWS Transfer (SFTP) server. Here are my requirements:

1. Users must be authenticated via a third-party identity provider, which is Azure authentication in our case.
2. Once a user has logged in, they should see two folders in their home directory, i.e. {transfer:user}/folder1 and {transfer:user}/folder2.
3. Users should be restricted to putting files in either folder1 or folder2, not in their home directory itself.
4. Users should be able to download files only if a specific tag is set on the object in S3.

So far I have been able to achieve steps 1 and 2:
Step 1 -- custom authentication using a Lambda function.
Step 2 -- once the user authenticates successfully, the Lambda creates folder1 and folder2 in their home directory.

But when users log in, they are not able to see folder1 and folder2 in their home directory, although I can see the folders were created successfully in the S3 bucket. Here is the IAM role attached to the Transfer server; I am not able to figure out what's wrong with it. Any help would be appreciated.

```
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "ReadWriteS3",
            "Effect": "Allow",
            "Action": [
                "s3:ListBucket",
                "s3:GetBucketLocation"
            ],
            "Resource": ["arn:aws:s3:::s3-bucket"]
        },
        {
            "Sid": "HomeDirObjectAccess",
            "Effect": "Allow",
            "Action": ["s3:PutObject"],
            "Resource": ["arn:aws:s3:::s3-bucket/*"]
        },
        {
            "Sid": "DownloadAllowed",
            "Effect": "Allow",
            "Action": [
                "s3:GetObject",
                "s3:GetObjectVersion"
            ],
            "Resource": ["arn:aws:s3:::s3-bucket/*"],
            "Condition": {
                "StringEquals": {
                    "s3:ExistingObjectTag/allowdownload": "yes"
                }
            }
        },
        {
            "Sid": "DownloadNotAllowed",
            "Effect": "Deny",
            "Action": [
                "s3:GetObject",
                "s3:GetObjectVersion"
            ],
            "Resource": ["arn:aws:s3:::s3-bucket/*"],
            "Condition": {
                "StringEquals": {
                    "s3:ExistingObjectTag/allowdownload": "no"
                }
            }
        },
        {
            "Sid": "DenyMkdir",
            "Effect": "Deny",
            "Action": "s3:PutObject",
            "Resource": "arn:aws:s3:::s3-bucket/*/*/"
        }
    ]
}
```

Within the Lambda where user authentication happens, I am returning the user's home directory:

```
HomeDirectoryDetails = [{"Entry":"/","Target":"/s3-bucket/${transfer:UserName}"}]
```

I also tried the following, but with no luck:

```
HomeDirectoryDetails = [{"Entry":"/folder1","Target":"/s3-bucket/${transfer:UserName}/folder1"},{"Entry":"/folder2","Target":"/s3-bucket/${transfer:UserName}/folder2"}]
```

The user gets a permission denied error when trying to do "ls" in their home directory:

```
sftp> ls
Couldn't read directory: Permission denied
```
1 answer · 0 votes · 73 views
asked 3 months ago
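One thing worth checking when a custom identity provider returns HomeDirectoryDetails is that the response also sets HomeDirectoryType to LOGICAL and passes HomeDirectoryDetails as a JSON string; otherwise the mappings are typically not applied. The test-identity-provider call shows exactly what the server receives from the Lambda. A sketch with a placeholder server ID, user name and password:

```
SERVER_ID="s-1234567890abcdef0"   # placeholder
USER_NAME="testuser"              # placeholder

# Shows the raw response the Transfer server gets from the custom identity
# provider: Role, HomeDirectoryType, HomeDirectoryDetails, Policy, PublicKeys.
aws transfer test-identity-provider \
  --server-id "$SERVER_ID" \
  --user-name "$USER_NAME" \
  --user-password 'placeholder-password' \
  --query '{Status:StatusCode,Response:Response}'
```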
I can't log in to SFTP using my private key on my instance (ami-0fec1fb452e2ab3b0) with ubuntu as the username. Do I need an SFTP server or something?
2 answers · 0 votes · 68 views
Loot
asked 5 months ago
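SFTP to an EC2 instance runs over the same OpenSSH daemon as regular SSH, so no separate SFTP server is needed as long as SSH with that key pair works. A sketch with a placeholder key file and hostname:

```
# Key file and hostname are placeholders -- use the key pair assigned to the
# instance and its public DNS name.
chmod 400 my-key.pem
sftp -i my-key.pem ubuntu@ec2-203-0-113-10.compute-1.amazonaws.com

# If that fails, plain SSH with the same key and verbose output usually
# shows whether the problem is the key, the username, or the security group.
ssh -v -i my-key.pem ubuntu@ec2-203-0-113-10.compute-1.amazonaws.com
```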
Hi all, we have set up an AWS Transfer server with AWS Directory Service (connected to Microsoft AD) authentication. As per the use case, once a user logs in to SFTP, they should be able to see two directories within their own folder:

{username}/folder1
{username}/folder2

I have set up the access and the IAM policy (attached to S3) as below.

create-access CLI:

```
aws transfer create-access \
    --home-directory-type LOGICAL \
    --home-directory-mappings '[{"Entry":"/folder1","Target":"/bucket_name/${transfer:UserName}/folder1"},{"Entry":"/folder2","Target":"/bucket_name/${transfer:UserName}/folder2"}]' \
    --role arn:aws:iam::account_id:role/iam_role \
    --server-id s-1234567876454ert \
    --external-id S-1-2-34-56789123-12345678-1234567898-1234
```

The access was created successfully. The IAM role below is attached to the S3 bucket and the Transfer server.

```
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "ReadWriteS3",
            "Effect": "Allow",
            "Action": [
                "s3:ListBucket",
                "s3:GetBucketLocation"
            ],
            "Resource": ["arn:aws:s3:::bucket_name"]
        },
        {
            "Sid": "",
            "Effect": "Allow",
            "Action": [
                "s3:PutObject",
                "s3:GetObject",
                "s3:DeleteObject",
                "s3:DeleteObjectVersion",
                "s3:GetObjectVersion",
                "s3:GetObjectACL",
                "s3:PutObjectACL"
            ],
            "Resource": ["arn:aws:s3:::bucket_name/${transfer:UserName}/*"]
        }
    ]
}
```

When users log in to SFTP, they do not see folder1 and folder2 in their own directory. Can anyone help if anything is missing in the IAM policy? Thank you
3 answers · 0 votes · 130 views
asked 5 months ago
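A quick way to see what the server actually stored for this access, and whether the mapped targets exist in the bucket, is to read the access back and list the target prefixes. The server ID and external ID mirror the question; the bucket name and AD user are placeholders:

```
SERVER_ID="s-1234567876454ert"
EXTERNAL_ID="S-1-2-34-56789123-12345678-1234567898-1234"
BUCKET="bucket_name"
USER_NAME="jdoe"   # placeholder AD user name

# Confirm the stored home directory type, mappings and role for this access.
aws transfer describe-access \
  --server-id "$SERVER_ID" \
  --external-id "$EXTERNAL_ID" \
  --query 'Access.{Type:HomeDirectoryType,Mappings:HomeDirectoryMappings,Role:Role}'

# Confirm the mapped targets actually exist as prefixes in the bucket.
for f in folder1 folder2; do
  aws s3api list-objects-v2 --bucket "$BUCKET" \
    --prefix "${USER_NAME}/${f}/" --max-keys 5 --query 'KeyCount'
done
```

A KeyCount of 0 means the target prefix has no objects yet; since S3 has no real folders, putting at least one object under each target is worth trying before digging further into the IAM policy.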