What is the recommended way to copy an entire S3 bucket (non-versioned) in Step Functions, including large files, so that size limits and timeouts won't cause problems?


1 answer

S3 buckets can be virtually unlimited in size, so the copy operation could take some time. To perform a bucket-to-bucket copy asynchronously, I would look at using a Lambda function in your Step Functions workflow to start an S3 Batch Operations job.

S3 Batch Operations is an Amazon S3 data management feature that lets you manage billions of objects at scale with a single API request. More on S3 Batch Operations here:

https://aws.amazon.com/s3/features/batch-operations/
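
As a rough illustration of the idea, here is a minimal sketch of a Lambda handler that a Step Functions Task state could invoke to start a Batch Operations copy job with boto3. The event field names (`SourceBucket`, `DestinationBucket`, `BatchOperationsRoleArn`, `AccountId`) are hypothetical and assume the state machine passes them in its input; the sketch also assumes an IAM role already exists that Batch Operations can assume to read the source bucket and write to the destination.

```python
# Hypothetical Lambda handler started from a Step Functions Task state.
# The bucket names, account ID, and role ARN are illustrative placeholders
# taken from the state input, not fixed values.
import boto3

s3control = boto3.client("s3control")

def lambda_handler(event, context):
    account_id = event["AccountId"]
    source_bucket = event["SourceBucket"]        # bucket name, not ARN
    dest_bucket = event["DestinationBucket"]
    role_arn = event["BatchOperationsRoleArn"]   # role the batch job assumes

    response = s3control.create_job(
        AccountId=account_id,
        ConfirmationRequired=False,   # start the job without manual confirmation
        Priority=10,
        RoleArn=role_arn,
        Operation={
            "S3PutObjectCopy": {
                # Destination bucket for every copied object
                "TargetResource": f"arn:aws:s3:::{dest_bucket}",
            }
        },
        # Have S3 Batch Operations generate the manifest from the source
        # bucket instead of supplying a pre-built CSV/inventory manifest.
        ManifestGenerator={
            "S3JobManifestGenerator": {
                "SourceBucket": f"arn:aws:s3:::{source_bucket}",
                "EnableManifestOutput": False,
            }
        },
        Report={
            "Enabled": False,   # or point this at a reporting bucket
        },
    )

    # Return the job ID so a later state can poll the job status.
    return {"JobId": response["JobId"]}
```

Because the job runs asynchronously inside S3, one option is to follow this Task state with a Wait/Choice loop that calls `describe_job` (from another small Lambda or an SDK integration) until the job reaches a terminal status, so the state machine itself never has to hold a long-running copy.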

AWS
answered a year ago
AWS Expert
reviewed a year ago
  • Thanks @tedtrent. Does this approach require the S3 batch operation to be created in advance (with source and destination buckets predefined)? Or can the bucket references still be dynamic, based on the input parameters of the step function?
