What is the recommended way to copy an entire S3 bucket (non-versioned) in Step Functions, including large files, so that size limits and timeouts won't cause problems?

wab
Asked 1 year ago · 917 views
1 Answer

S3 buckets can hold a virtually unlimited amount of data, so a full copy can run longer than any single Lambda or Step Functions task timeout allows. To perform a bucket-to-bucket copy asynchronously, I would use a Lambda function in your Step Functions state machine to create an S3 Batch Operations job.

S3 Batch Operations is an Amazon S3 data management feature that lets you manage billions of objects at scale with a single API request. More on S3 Batch Operations here:

https://aws.amazon.com/s3/features/batch-operations/
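A minimal sketch of what that Lambda might look like, using boto3's `s3control` client. The event fields, bucket names, and role ARN are illustrative assumptions, not part of the original answer. Passing the buckets in from the Step Functions input keeps them dynamic per execution, and the `ManifestGenerator` parameter asks Batch Operations to enumerate the source bucket itself, so no manifest file has to be prepared in advance:

```python
def build_copy_job_args(account_id, source_bucket, dest_bucket, role_arn):
    """Build the create_job request for an S3 Batch Operations bucket copy.

    Kept free of AWS SDK imports so the request shape can be inspected
    and tested without credentials.
    """
    return {
        "AccountId": account_id,
        "ConfirmationRequired": False,
        # Copy every matched object into the destination bucket.
        "Operation": {
            "S3PutObjectCopy": {"TargetResource": f"arn:aws:s3:::{dest_bucket}"}
        },
        # Let Batch Operations generate the object manifest from the
        # source bucket instead of supplying a pre-built CSV/inventory.
        "ManifestGenerator": {
            "S3JobManifestGenerator": {
                "SourceBucket": f"arn:aws:s3:::{source_bucket}",
                "EnableManifestOutput": False,
            }
        },
        "Priority": 10,
        "Report": {"Enabled": False},
        # IAM role the Batch Operations job assumes; it needs read access
        # to the source bucket and write access to the destination.
        "RoleArn": role_arn,
    }


def lambda_handler(event, context):
    """Step Functions task: kick off the copy job and return its ID."""
    import boto3  # imported lazily so the builder above has no dependencies

    s3control = boto3.client("s3control")
    response = s3control.create_job(
        **build_copy_job_args(
            event["accountId"],
            event["sourceBucket"],
            event["destinationBucket"],
            event["roleArn"],
        )
    )
    return {"jobId": response["JobId"]}
```

Because `create_job` only starts the job, the state machine can poll `describe_job` in a wait/choice loop (or a callback pattern) until the job completes, which is what makes the overall copy immune to Lambda's own timeout.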

AWS
Answered 1 year ago
Reviewed by an AWS Expert 1 year ago
  • Thanks @tedtrent. Does this approach require the S3 batch operation to be created in advance (with source and destination buckets predefined)? Or can the bucket references still be dynamic, based on the input parameters of the step function?
