What is the recommended way to copy an entire S3 bucket (non-versioned) in Step Functions, including large files, so that size limits and timeouts won't cause problems?

wab
asked a year ago · 919 views
1 Answer

An S3 bucket can hold a virtually unlimited amount of data, so copying its entire contents can take a long time. To run the bucket-to-bucket copy asynchronously, I would use a Lambda function in your Step Functions state machine to start an S3 Batch Operations job.

S3 Batch Operations is an Amazon S3 data management feature that lets you manage billions of objects at scale with a single request. More on S3 Batch Operations here:

https://aws.amazon.com/s3/features/batch-operations/
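
As a rough illustration of that pattern, here is a minimal sketch of a Lambda handler that a Step Functions task state could invoke to start an S3 Batch Operations copy job via the S3 Control `create_job` API. The event field names (`DestinationBucket`, `ManifestObjectArn`, `BatchOperationsRoleArn`, etc.) are assumptions about how you might shape the state machine input, not part of the original answer, and the job still needs a manifest (a CSV or S3 Inventory report) listing the objects to copy:

```python
import uuid
import boto3

# S3 Batch Operations jobs are created through the S3 Control API.
s3control = boto3.client("s3control")


def lambda_handler(event, context):
    # All of these inputs are hypothetical names passed in from the
    # Step Functions state machine input.
    account_id = event["AccountId"]
    destination_bucket = event["DestinationBucket"]
    manifest_arn = event["ManifestObjectArn"]    # CSV manifest: one "bucket,key" per line
    manifest_etag = event["ManifestObjectETag"]
    role_arn = event["BatchOperationsRoleArn"]   # IAM role that S3 assumes to run the job

    response = s3control.create_job(
        AccountId=account_id,
        ConfirmationRequired=False,
        ClientRequestToken=str(uuid.uuid4()),
        Operation={
            "S3PutObjectCopy": {
                "TargetResource": f"arn:aws:s3:::{destination_bucket}"
            }
        },
        Manifest={
            "Spec": {
                "Format": "S3BatchOperations_CSV_20180820",
                "Fields": ["Bucket", "Key"],
            },
            "Location": {
                "ObjectArn": manifest_arn,
                "ETag": manifest_etag,
            },
        },
        Report={
            "Bucket": f"arn:aws:s3:::{destination_bucket}",
            "Prefix": "batch-copy-reports",
            "Format": "Report_CSV_20180820",
            "Enabled": True,
            "ReportScope": "FailedTasksOnly",
        },
        Priority=10,
        RoleArn=role_arn,
        Description="Bucket-to-bucket copy started from Step Functions",
    )

    # Return the job ID so a later state can poll describe_job until the job completes.
    return {"JobId": response["JobId"]}
```

Because the job runs asynchronously, the state machine would typically follow this task with a wait/poll loop (or a callback pattern) that checks the job status rather than waiting inside the Lambda function.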

AWS
answered a year ago
AWS
EXPERT
reviewed a year ago
  • Thanks @tedtrent. Does this approach require the S3 batch operation to be created in advance (with source and destination buckets predefined)? Or can the bucket references still be dynamic, based on the input parameters of the step function?
