AWS Transfer Python Coding Question


I'm coding a Python script to back up my computer to my S3 bucket. However, I just saw the following restrictions on how many steps, etc., are allowed in a workflow for the Transfer Family:

Limitations: Additionally, the following functional limits apply to workflows for Transfer Family:

  - The number of workflows per account is limited to 10.
  - The maximum timeout for custom steps is 30 minutes.
  - The maximum number of steps in a workflow is 8.

How do I back up my data to my S3 bucket on my account using Python? I'd also like to learn how to code in Python to access AWS services, so that's the main reason I'd like to do this. I've created some code already to create a Transfer Family server, start it, stop it, and delete it.
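For context, here is a rough sketch of the kind of code I already have, using boto3's Transfer Family client (the server options shown are placeholders rather than my actual configuration):

```python
import boto3

# The Transfer Family service is exposed in boto3 as "transfer".
transfer = boto3.client("transfer")

# Create a service-managed SFTP server. These options are placeholders;
# pick the protocols, identity provider, and endpoint type you need.
response = transfer.create_server(
    Protocols=["SFTP"],
    IdentityProviderType="SERVICE_MANAGED",
    EndpointType="PUBLIC",
)
server_id = response["ServerId"]
print(f"Created server {server_id}")

# Start the server. In a real script you would poll describe_server()
# and wait for the State to become ONLINE before stopping it.
transfer.start_server(ServerId=server_id)

# Stop and delete the server when you're done with it.
transfer.stop_server(ServerId=server_id)
transfer.delete_server(ServerId=server_id)
```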

2 Answers
Accepted Answer

Hi there,

Consider using S3 APIs directly instead of AWS Transfer

The AWS Transfer Family of services is typically a better fit when the object sender is unable to use S3 REST APIs to write directly to a bucket and instead must use protocols like SFTP (common in B2B scenarios).

While you can certainly use AWS Transfer services, it doesn't sound like you have a hard requirement to do so. Instead, it seems you're just looking for good ways to learn Python programming and cloud development. If that's the case, I would suggest you instead use the AWS SDK for Python (boto3) to read from/write to S3 directly. I think you'll find this much easier to code than putting the AWS Transfer service in the middle of things.
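For example, here is a minimal sketch of writing to and reading from a bucket with boto3 (the bucket name and key are placeholders):

```python
import boto3

s3 = boto3.client("s3")

# Placeholder names; substitute your own bucket and key.
bucket = "my-backup-bucket"
key = "backups/hello.txt"

# Write an object directly to S3.
s3.put_object(Bucket=bucket, Key=key, Body=b"hello from boto3")

# Read it back.
obj = s3.get_object(Bucket=bucket, Key=key)
print(obj["Body"].read().decode())
```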

The other big benefit of using S3 APIs directly is that writing files directly to S3 is going to be less expensive. The two biggest reasons are:

  1. AWS Transfer creates and manages a transfer endpoint for you. You pay an hourly cost for that endpoint even if it is not in use. Writing to S3 requires no such endpoint, thus no hourly cost.

  2. AWS Transfer charges $0.04 per gigabyte transferred in either direction. If you use S3 APIs, there are no data transfer costs when writing to S3 (though PUT requests are billed at roughly $0.005 per 1,000 PutObject calls), and reading from S3 typically starts at ~$0.02 per gigabyte. A rough comparison follows this list.
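As a back-of-the-envelope illustration using only the per-gigabyte and per-request figures quoted above (it ignores the hourly endpoint charge, S3 storage, and any volume discounts):

```python
# Hypothetical backup: 100 GB spread across 1,000 files.
gb_uploaded = 100
files_uploaded = 1_000

# AWS Transfer Family: $0.04 per GB transferred (endpoint hours not included).
transfer_family_cost = gb_uploaded * 0.04

# Direct S3 PUTs: no per-GB ingress charge, ~$0.005 per 1,000 PUT requests.
direct_s3_cost = (files_uploaded / 1_000) * 0.005

print(f"Transfer Family data charge: ${transfer_family_cost:.2f}")   # $4.00
print(f"Direct S3 request charge:    ${direct_s3_cost:.4f}")         # $0.0050
```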

Refer to AWS's pricing pages for the latest pricing.

AWS Transfer workflow limits and alternatives

Many (but not all) AWS service limits are adjustable. These limits are sometimes referred to as "service quotas", and these docs show that the limit of 10 workflows per account can be raised. I don't see any mention of steps per workflow, so that may be a hard limit. See here for how to request a limit increase.

I don't know what you're trying to accomplish with AWS Transfer Workflows, but if you decide to send data to S3 directly, there are still ways to build your own workflows. A really common pattern is to create an S3 Event Notification. You can configure an S3 event to trigger when certain events occur, like an object being put or deleted, and choose to send the event to one of several AWS services (such as AWS Lambda, SNS, SQS, or EventBridge) that you can use to build your own custom automation/workflow; a minimal sketch of wiring this up follows.
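For instance, here is a minimal sketch of pointing an S3 event notification at a Lambda function with boto3 (the bucket name and Lambda ARN are placeholders, and the function's resource policy must already allow S3 to invoke it):

```python
import boto3

s3 = boto3.client("s3")

# Placeholder values; use your own bucket and Lambda function ARN.
bucket = "my-backup-bucket"
lambda_arn = "arn:aws:lambda:us-west-2:123456789012:function:on-backup-upload"

# Invoke the Lambda function whenever an object is created in the bucket.
# S3 must be permitted to invoke the function (e.g. via lambda add-permission).
s3.put_bucket_notification_configuration(
    Bucket=bucket,
    NotificationConfiguration={
        "LambdaFunctionConfigurations": [
            {
                "LambdaFunctionArn": lambda_arn,
                "Events": ["s3:ObjectCreated:*"],
            }
        ]
    },
)
```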

Python examples

There are lots of examples of Python + S3 out there that are worth a look.
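As one illustration of the backup use case in the question, here is a rough sketch of uploading a local directory tree to a bucket with boto3 (the source directory, bucket, and key prefix are placeholders, and a real backup script would add error handling and skip files that haven't changed):

```python
import os
import boto3

s3 = boto3.client("s3")

# Placeholder values; point these at your own data and bucket.
source_dir = "/home/me/documents"
bucket = "my-backup-bucket"
key_prefix = "backups/documents"

# Walk the directory tree and upload each file, preserving the
# relative path as the S3 key under key_prefix.
for root, _dirs, files in os.walk(source_dir):
    for name in files:
        local_path = os.path.join(root, name)
        relative_path = os.path.relpath(local_path, source_dir).replace(os.sep, "/")
        key = f"{key_prefix}/{relative_path}"
        s3.upload_file(local_path, bucket, key)
        print(f"Uploaded {local_path} -> s3://{bucket}/{key}")
```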

Pricing Notes

Any reference to pricing is based on a quick review of the us-west-2 rates from AWS public documentation. AWS prices can vary by region and some services also offer automatic tiering (discounts) for high-volume usage.

mwrb
answered 2 years ago
  • Thanks so much. I really appreciate the help.


You mentioned "I'd also like to learn how to code in Python to access AWS services, so that's the main reason I'd like to do this." There is an AWS workshop named Learn Python On AWS Workshop.

From the workshop blurb "This workshop will teach you the basics of the python programming language using Amazon Web Services (AWS). It is aimed at beginners who have never programmed in python and it uses similar methods of explaining the basics as other books and tutorials on the python programming language."

RoB
answered 2 years ago
  • Thanks so much. This is just what I've been looking for. Like I said, I want to learn how to access AWS with python and this sounds perfect for that.

  • Are there any more advanced classes on specific services I can take? That class was too basic. Thanks for the help, though.

  • On second thought, I'll use this re:Post area to find out what service to use and then use the boto3 documentation to learn how to create the code. Sorry for my indecision. I'm an experienced programmer and I think that's best for me, at least.
