What is the recommended way to copy an entire S3 bucket (non-versioned) in Step Functions, including large files, so that size limits and timeouts won't cause problems?

wab
asked a year ago
1 Answer

S3 buckets can be virtually unlimited in size, so copying an entire bucket can take far longer than a single Lambda invocation or state allows. To perform the bucket-to-bucket copy asynchronously, I would have a Lambda function in your Step Functions state machine create an S3 Batch Operations copy job.

S3 Batch Operations is an Amazon S3 data management feature that lets you manage billions of objects at scale with a single request. More on S3 Batch Operations here:

https://aws.amazon.com/s3/features/batch-operations/
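
A minimal sketch of what that Lambda function might look like, assuming the source and destination bucket names and the batch job's IAM role ARN arrive in the Step Functions input event (all bucket names, the role ARN, and the report prefix below are placeholders):

```python
import boto3

s3control = boto3.client("s3control")
sts = boto3.client("sts")


def lambda_handler(event, context):
    """Create an S3 Batch Operations copy job from the state machine input.

    Assumed input shape (placeholders):
      {"SourceBucket": "my-source-bucket",
       "DestinationBucket": "my-destination-bucket",
       "BatchRoleArn": "arn:aws:iam::123456789012:role/batch-copy-role"}
    """
    account_id = sts.get_caller_identity()["Account"]
    source_bucket = event["SourceBucket"]
    dest_bucket = event["DestinationBucket"]

    response = s3control.create_job(
        AccountId=account_id,
        ConfirmationRequired=False,
        # Copy every object in the generated manifest to the destination bucket.
        Operation={
            "S3PutObjectCopy": {
                "TargetResource": f"arn:aws:s3:::{dest_bucket}",
            }
        },
        # Let S3 generate the manifest from the source bucket instead of
        # supplying a CSV manifest or inventory report up front.
        ManifestGenerator={
            "S3JobManifestGenerator": {
                "SourceBucket": f"arn:aws:s3:::{source_bucket}",
                "EnableManifestOutput": False,
            }
        },
        # Write a completion report covering failed tasks only.
        Report={
            "Bucket": f"arn:aws:s3:::{dest_bucket}",
            "Format": "Report_CSV_20180820",
            "Enabled": True,
            "Prefix": "batch-copy-reports",
            "ReportScope": "FailedTasksOnly",
        },
        Priority=10,
        RoleArn=event["BatchRoleArn"],
        Description=f"Copy {source_bucket} to {dest_bucket}",
    )

    # Return the job ID so a later state can poll describe_job for completion.
    return {"JobId": response["JobId"]}
```

Because the bucket names come from the Lambda event, the state machine input can drive which buckets are copied, and a subsequent state can poll the returned job ID with DescribeJob to wait for the copy to finish.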

AWS
answered a year ago
AWS EXPERT
reviewed a year ago
  • Thanks @tedtrent. Does this approach require the S3 batch operation to be created in advance (with source and destination buckets predefined)? Or can the bucket references still be dynamic, based on the input parameters of the step function?
