What is the recommended way to copy an entire S3 bucket (non-versioned) in Step Functions, including large files, so that size limits and timeouts won't cause problems?

wab
Asked 1 year ago · 918 views

1 Answer

An S3 bucket can hold a virtually unlimited amount of data, so a full bucket-to-bucket copy can take far longer than a single Lambda invocation or Step Functions task allows. To run the copy asynchronously, I would use a Lambda function in your Step Functions state machine to start an S3 Batch Operations job.

S3 Batch Operations is an Amazon S3 data-management feature that lets you manage billions of objects at scale with a single API request. More on S3 Batch Operations here:

https://aws.amazon.com/s3/features/batch-operations/
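
A minimal sketch of what that Lambda function might look like, using boto3's `s3control.create_job` API. The event keys (`accountId`, `sourceBucket`, `destinationBucket`, `batchOpsRoleArn`) are assumptions about how the state machine passes its input; adjust them to your own input shape:

```python
import boto3

# Sketch of a Lambda handler that starts an S3 Batch Operations copy job.
# Event keys below are illustrative assumptions about the Step Functions input.
s3control = boto3.client("s3control")


def lambda_handler(event, context):
    source_bucket = event["sourceBucket"]            # dynamic, from SFN input
    destination_bucket = event["destinationBucket"]  # dynamic, from SFN input

    response = s3control.create_job(
        AccountId=event["accountId"],
        ConfirmationRequired=False,  # start the job without manual confirmation
        Operation={
            "S3PutObjectCopy": {
                "TargetResource": f"arn:aws:s3:::{destination_bucket}",
            }
        },
        # Let Batch Operations generate the object manifest directly from the
        # source bucket, so no pre-built CSV or S3 Inventory manifest is needed.
        ManifestGenerator={
            "S3JobManifestGenerator": {
                "SourceBucket": f"arn:aws:s3:::{source_bucket}",
                "EnableManifestOutput": False,
            }
        },
        Report={"Enabled": False},
        Priority=10,
        RoleArn=event["batchOpsRoleArn"],  # IAM role the Batch Operations job assumes
        Description=f"Copy s3://{source_bucket} to s3://{destination_bucket}",
    )
    # Return the job ID so a later state can poll until the job completes.
    return {"jobId": response["JobId"]}
```

Because the bucket ARNs are built from the event, the source and destination stay dynamic per execution. A subsequent Wait/Task loop in the state machine can poll `s3control.describe_job` with the returned job ID until the job status reaches `Complete`.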

AWS
Answered 1 year ago
Reviewed 1 year ago by an AWS Expert
  • Thanks @tedtrent. Does this approach require the S3 Batch Operations job to be created in advance (with source and destination buckets predefined), or can the bucket references still be dynamic, based on the input parameters of the Step Functions execution?
