Questions tagged with AWS Transfer Family
[Transfer Family](https://aws.amazon.com/aws-transfer-family/) supports S3 and EFS storage targets, but a project requirement specifies that uploaded and downloaded files be available to Windows EC2 instances via SMB (specifically an [FSx](https://aws.amazon.com/fsx/) share). I would have the EC2 instances mount the EFS volume used by Transfer Family, but [Microsoft](https://learn.microsoft.com/en-us/windows-server/storage/nfs/nfs-overview) says Windows NFS clients can only use NFSv2 or NFSv3. Since Transfer Family doesn't [natively](https://aws.amazon.com/aws-transfer-family/features/#Data_stored_natively_in_AWS_Storage_services) support FSx, is [DataSync](https://aws.amazon.com/datasync/) "between AWS storage services" the best way to support this workflow? As an added twist, we'll probably need to support both uploads (items that arrive via SFTP and are consumed by the EC2 instances) and downloads (an EC2 instance outputs a file, which a remote user then fetches via SFTP), so we'll need [bidirectional](https://docs.aws.amazon.com/datasync/latest/userguide/other-use-cases.html#opposite-direction-tasks) DataSync between the EFS and FSx volumes.
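A sketch of what the two opposite-direction DataSync tasks might look like. The location ARNs and task names below are placeholders, and only the request dictionaries are built here; the real calls would be `boto3.client("datasync").create_task(**task)` once EFS and FSx locations exist.

```python
# Placeholder location ARNs -- substitute the real EFS and FSx for Windows
# locations created in DataSync.
EFS_LOC = "arn:aws:datasync:us-east-1:123456789012:location/loc-efs-example"
FSX_LOC = "arn:aws:datasync:us-east-1:123456789012:location/loc-fsx-example"

def task(source: str, destination: str, name: str) -> dict:
    """Build a CreateTask request for one direction of the sync."""
    return {
        "SourceLocationArn": source,
        "DestinationLocationArn": destination,
        "Name": name,
        # Only copy files that changed since the last run.
        "Options": {"TransferMode": "CHANGED", "OverwriteMode": "ALWAYS"},
    }

inbound = task(EFS_LOC, FSX_LOC, "sftp-uploads-to-fsx")   # SFTP uploads -> EC2
outbound = task(FSX_LOC, EFS_LOC, "ec2-outputs-to-sftp")  # EC2 outputs -> SFTP
print(inbound["Name"], outbound["Name"])
```

Each task would run on its own schedule; keeping the two directions on disjoint directory trees avoids the overwrite conflicts the DataSync "opposite direction" docs warn about.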
Connection logs for troubleshooting "target machine actively refused connection" on Transfer Family SFTP
I am trying to troubleshoot an SFTP connection that fails with the error message "target machine actively refused connection". Is there somewhere I can find server logs with details about why the connection was refused? The CloudWatch logs for Transfer Family seem to start only after a successful login. Other connections work fine, so it seems client-related; I'm just trying to figure out what the client is doing that causes the connection to be rejected.
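One thing worth noting: "actively refused" is a TCP-level rejection (a RST on the SYN), which happens before any SSH/SFTP exchange, so logs written after authentication would never record it. A quick client-side reachability check can at least confirm whether anything is listening; the hostname in the comment is a placeholder for the server endpoint.

```python
import socket

def port_open(host: str, port: int = 22, timeout: float = 3.0) -> bool:
    """Return True if a plain TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:  # refused, timed out, unreachable, DNS failure
        return False

# Example (placeholder endpoint):
# port_open("s-1234567890abcdef0.server.transfer.us-east-1.amazonaws.com")
```

If this returns False from the affected client but True elsewhere, the culprit is usually local: a proxy, firewall, or a security group/ACL rule keyed to that client's source IP.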
Hello Community, I am trying to narrow down one of two options for transferring files from an SFTP server to an S3 bucket, to feed my Glue jobs, because AWS Glue doesn't support data loads from other cloud applications or file storage. I'm trying to choose between AWS Transfer Family and an AWS Lambda function that connects to the remote server and moves the files into the S3 bucket/folder that becomes the source for my integrations. I would greatly appreciate any insights into this scenario, and the advantages and drawbacks of choosing one over the other. Did you face any bottlenecks using either of these services for file transfer? Which is more cost-effective, say for gigabytes of data (files)? Thank you. Best, Tharun
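On the cost question, a rough back-of-envelope sketch for the Transfer Family side. The figures below are assumptions based on typical us-east-1 list pricing, not authoritative numbers; check the current pricing page before deciding.

```python
# Assumed list prices -- verify against the current AWS pricing page.
ENDPOINT_PER_HOUR = 0.30   # $ per hour per enabled protocol (e.g. SFTP)
DATA_PER_GB = 0.04         # $ per GB uploaded or downloaded
HOURS_PER_MONTH = 730

def transfer_family_monthly_cost(gb_transferred: float) -> float:
    """Endpoint is billed whether or not files move; data is billed per GB."""
    return ENDPOINT_PER_HOUR * HOURS_PER_MONTH + DATA_PER_GB * gb_transferred

print(round(transfer_family_monthly_cost(100), 2))  # -> 223.0
```

The takeaway is that the always-on endpoint dominates at low volume, whereas a Lambda that pulls files on a schedule is billed only for invocation time, so for "gigabytes per month" workloads Lambda is usually cheaper, at the cost of writing and maintaining the SFTP client code yourself.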
Is it possible to integrate a Transfer Family custom identity provider in account A with an API Gateway in account B using an IAM authorizer? If so, how?
AWS Transfer Family client returns an error in Lambda that list_profiles does not exist, but the code runs fine locally
When trying to grab the profiles from the AWS Transfer Family client, it returns an error saying 'Transfer' object has no attribute 'list_profiles'. However, the same code works fine when I run it locally; it only fails in AWS Lambda. I thought it might be a permissions issue, but the role has full access to Transfer Family. It also has no issues running other functions like describe_server or list_tags_for_resource, so it's not as if the client is missing entirely.
```
client = boto3.client(service_name='transfer', region_name=region)
response = client.list_profiles()
```
By the way, I tested other profile functions like describe_profile and hit the same missing-attribute issue. Here's the documentation I was following: https://boto3.amazonaws.com/v1/documentation/api/latest/reference/services/transfer.html#Transfer.Client.describe_profile
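"Works locally, AttributeError in Lambda" for a boto3 client method usually points at the SDK version rather than permissions: the boto3 bundled into the Lambda runtime lags behind the latest release, and the Transfer profile APIs were added in a newer SDK than the one Lambda ships, so packaging a current boto3 with the function (or in a layer) typically fixes it. A small version check makes the mismatch visible; the minimum version below is an illustrative assumption, not a looked-up release number.

```python
def version_at_least(version: str, minimum: str) -> bool:
    """Compare dotted version strings numerically ("1.9" < "1.26")."""
    parse = lambda v: tuple(int(p) for p in v.split("."))
    return parse(version) >= parse(minimum)

# Inside the Lambda handler you could log the bundled SDK version:
#   import boto3, botocore
#   print(boto3.__version__, botocore.__version__)
# and compare it with the release that introduced the profile APIs, e.g.:
print(version_at_least("1.20.32", "1.26.0"))  # older bundled SDK -> False
print(version_at_least("1.28.5", "1.26.0"))   # recent local SDK  -> True
```

That also explains why describe_server and list_tags_for_resource work: they are older operations that exist in the outdated client, while every profile-related method is absent from it.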
Hi team, I have a private VPC with all private subnets, and I created an SFTP server with:
- Protocols = SFTP
- Identity provider = Service managed
- VPC = my private VPC
- Access = Internal
- Domain = Amazon S3

The objective is to allow another team in the same company to load files into my S3 bucket. When I finished creating the SFTP server, it didn't give me an endpoint (Endpoint = '-' and Custom hostname = '-'). I just want to know how the other team can interact with the SFTP server to put files in my bucket, since the server is not publicly accessible and I don't have an endpoint URL to give them. How can they connect to my server to put files? Can they use clients like FileZilla, PuTTY, or WinSCP to transfer files? Thank you!
I have an SFTP Transfer Family server in front of an EFS volume. I recently noticed that any new files/directories that are created are given world-writable permissions. ![Enter image description here](/media/postImages/original/IMACHIabpuR123e5bnwVrnfg) Is there a way to change the default permissions for uploaded files?
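I'm not aware of a umask-style setting on EFS-backed Transfer Family servers, so if none exists, one workaround is a post-upload step (for example a scheduled Lambda or cron job that mounts the same EFS volume) that clears the write bits for group and other. A minimal sketch of that fix-up, with the EFS mount path left to the deployment:

```python
import os
import stat

def strip_world_writable(mode: int) -> int:
    """Clear the group- and other-write bits, keeping the rest of the mode."""
    return mode & ~(stat.S_IWGRP | stat.S_IWOTH)

def tighten_tree(root: str) -> None:
    """Walk a directory (e.g. the EFS mount) and re-chmod loose entries."""
    for dirpath, dirnames, filenames in os.walk(root):
        for name in dirnames + filenames:
            path = os.path.join(dirpath, name)
            mode = stat.S_IMODE(os.stat(path).st_mode)
            fixed = strip_world_writable(mode)
            if fixed != mode:
                os.chmod(path, fixed)

print(oct(strip_world_writable(0o777)))  # -> 0o755
```

This is reactive rather than preventive, so there is a window where the permissive mode exists; restricting who can reach the mount (POSIX profile, access points, security groups) limits the exposure in the meantime.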
I have deployed a Transfer Family SFTP server (using Amazon EFS). I am having trouble configuring the user: I keep getting the error "Failed to create user (Unsupported or invalid SSH public key format)". I have tried using the key format AWS documents, but I still get the error. Has anyone had this issue, and how did you solve it?
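A common cause of this error is pasting a key in a multi-line format (PEM or PuTTY's RFC 4716 export) when Transfer Family expects the single-line OpenSSH format, `<algorithm> <base64-blob> [comment]`. A quick sanity check you can run on a key before pasting it; the accepted-algorithm list here is illustrative, not exhaustive.

```python
import base64

def looks_like_openssh_pubkey(key: str) -> bool:
    """Accept only the single-line '<algo> <base64-blob> [comment]' format."""
    stripped = key.strip()
    if "\n" in stripped:           # PEM / RFC 4716 blocks are multi-line
        return False
    parts = stripped.split()
    if len(parts) < 2:
        return False
    algo, blob = parts[0], parts[1]
    if algo not in ("ssh-rsa", "ssh-ed25519",
                    "ecdsa-sha2-nistp256", "ecdsa-sha2-nistp384"):
        return False
    try:
        decoded = base64.b64decode(blob, validate=True)
    except ValueError:
        return False
    # Per RFC 4253, the blob begins with the algorithm name, length-prefixed.
    return decoded[4:4 + len(algo)] == algo.encode()

print(looks_like_openssh_pubkey("-----BEGIN PUBLIC KEY-----"))  # -> False
```

If your key is in PuTTY format, `ssh-keygen -i -f key.pub -m RFC4716` (or PuTTYgen's "Public key for pasting into OpenSSH authorized_keys" box) produces the single-line form Transfer Family accepts.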
When setting up Transfer Family for AS2, I'm running into an error receiving a message. After using this guide (https://docs.aws.amazon.com/transfer/latest/userguide/as2-end-to-end-example.html#as2-create-certs) to create the certificates, I tried to set up the Transfer Family AS2 server to receive messages. The VPC is created and the endpoint can be reached. However, when actually sending the message, a 400 Bad Request error is returned with no other information. On the console, there's no record of data going in or out. Is there a way to view more information? Also, just to confirm: when the guide says to send public keys, that means the signing-cert.pem/encrypting-cert.pem, correct? I had those set up in the partner and there's no error, but I want to make sure it's not an authentication issue. By the way, using this guide (https://docs.aws.amazon.com/transfer/latest/userguide/as2-end-to-end-example.html#as2-test-config), I'm not able to connect with the link format in Step 7. The endpoint connection is actually http://s-1234567890abcdef0.SERVER.transfer.us-east-1.amazonaws.com:5080. The link is correct in the server configuration, but the guide is incorrect.
I'm using CyberDuck to log a user into Transfer Family with its own identity service. Is there a way to track whether a user's key authentication failed, or how many login attempts they made? I know you can track data transfer info [here](https://docs.aws.amazon.com/transfer/latest/userguide/monitoring.html), but that's not what I'm looking for. Thanks
Hi, the requirement is to send a file from an S3 bucket to an SFTP server (which has already been configured in our AWS Transfer Family). In the documentation, I read that AWS Transfer Family is "a secure transfer service that enables you to transfer files into and out of AWS storage services." But how do I do that? Is it possible to configure AWS Transfer Family to receive a file from an S3 bucket? Or must I configure the S3 bucket to send a file to an SFTP server (https://docs.aws.amazon.com/transfer/latest/userguide/transfer-file.html#post-processing-S3-object-metadata)? Any help or working example is highly appreciated. I lost several days trying to use the Paramiko library in a Lambda function and even a Glue job, and I never managed to get the library installed. Before resorting to (https://stackoverflow.com/questions/47905614/upload-s3-bucket-csv-file-to-sftp-server-in-nodejs), I thought I'd ask the experts here who may know the right and efficient way to do it. Thanks in advance, Anthony
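One option worth checking is Transfer Family's SFTP connectors, which push S3 objects out to a remote SFTP server through the StartFileTransfer API, so no Paramiko packaging is needed. A sketch under assumptions: the connector ID, bucket, and key are placeholders, and the `"/bucket/key"` path format is what I believe SendFilePaths expects (verify against the docs).

```python
def s3_send_paths(bucket: str, keys: list[str]) -> list[str]:
    """Build SendFilePaths entries ("/bucket/key") for StartFileTransfer."""
    return [f"/{bucket}/{key}" for key in keys]

# The actual call (requires an SFTP connector configured with the remote
# server's URL, credentials, and trusted host key):
#   import boto3
#   client = boto3.client("transfer")
#   client.start_file_transfer(
#       ConnectorId="c-1234567890abcdef0",  # placeholder
#       SendFilePaths=s3_send_paths("my-bucket", ["outbound/report.csv"]),
#   )
print(s3_send_paths("my-bucket", ["outbound/report.csv"]))
```

The call can be wired to an S3 event notification or EventBridge rule so that dropping a file into the bucket triggers the outbound transfer automatically.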
Hello Team, I am working on an AWS Transfer Family (SFTP) solution and need confirmation of whether this service can support both password- and SSH-key-based authentication at the same time (i.e., in one login attempt, when the user passes both via an SFTP client like FileZilla or WinSCP). I used a Lambda-based identity provider and found that when I pass both a password and an SSH key in FileZilla, the password is never passed to the Lambda, so the code logic has to assume key-based authentication. Can someone please advise?
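What you're observing is consistent with how SSH authentication works: the client negotiates one method per attempt and typically tries publickey before password, so when a key is offered the Lambda sees no `password` field; it should then return the user's registered public keys and let Transfer Family verify the signature. A minimal sketch of that branching, with all user data, the role ARN, and the home directory as placeholders (the event/response field names follow the documented custom identity provider contract):

```python
# Placeholder user store -- in practice this would be Secrets Manager,
# DynamoDB, or a directory lookup.
FAKE_USERS = {
    "alice": {
        "password": "s3cret",
        "public_keys": ["ssh-ed25519 AAAAC3...example user@host"],
        "role": "arn:aws:iam::123456789012:role/transfer-access",
        "home": "/my-bucket/alice",
    }
}

def lambda_handler(event, context):
    user = FAKE_USERS.get(event.get("username", ""))
    if user is None:
        return {}  # empty response = authentication failure

    if "password" in event:
        # Password attempt: verify it ourselves; no keys in the response.
        if event["password"] != user["password"]:
            return {}
        keys = []
    else:
        # Public-key attempt: return the keys and let Transfer Family
        # check the client's signature against them.
        keys = user["public_keys"]

    return {
        "Role": user["role"],
        "HomeDirectory": user["home"],
        "PublicKeys": keys,
    }
```

So the server supports either method per attempt, not both in a single exchange; requiring "key AND password" in one login is not something I'd expect the protocol flow to deliver to the Lambda.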