Using AWS Glue to export a ~500TB DynamoDB table to an S3 bucket


We have a use case where we want to export ~500TB of DynamoDB data to S3. One possible approach I found was to use an AWS Glue job. While exporting the data to S3, we also need to perform a transformation on the DynamoDB data, for which we make a service call to a Java package (the transformation logic isn't heavy and should complete in milliseconds). Is AWS Glue a good approach for exporting ~500TB of data?

shasnk
Asked 4 months ago · 282 views
2 Answers

You can use AWS Batch for exporting and transforming the 500TB of data from DynamoDB to an S3 bucket.

  • Start by using the native export functionality of DynamoDB to export your data directly to an S3 bucket. This approach is highly efficient for large datasets and does not impact the performance of your DynamoDB table.
  • Develop a Docker container with your transformation logic and upload it to Amazon ECR. Configure an AWS Batch compute environment with the necessary resources, define a job definition that tells AWS Batch how to run your container, and submit transformation jobs that read the exported data from S3 and write the transformed output back to S3 or another location (a minimal sketch of the export and job-submission calls is shown after this list).
  • Optionally, use AWS Step Functions to manage the workflow, particularly if the process involves multiple steps.
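To make the first two steps concrete, here is a minimal boto3 sketch (Python) of the two API calls involved; the table ARN, bucket, job queue, and job definition names are hypothetical placeholders:

```python
import boto3

dynamodb = boto3.client("dynamodb")
batch = boto3.client("batch")

# Native DynamoDB export to S3 (requires point-in-time recovery to be
# enabled on the table). Table ARN and bucket names are placeholders.
export = dynamodb.export_table_to_point_in_time(
    TableArn="arn:aws:dynamodb:us-east-1:123456789012:table/MyLargeTable",
    S3Bucket="my-export-bucket",
    S3Prefix="dynamodb-export/",
    ExportFormat="DYNAMODB_JSON",  # or "ION"
)
export_arn = export["ExportDescription"]["ExportArn"]
print("Export started:", export_arn)

# Once the export completes, submit an AWS Batch job that runs your
# transformation container against the exported files in S3.
job = batch.submit_job(
    jobName="transform-dynamodb-export",
    jobQueue="my-transform-queue",        # hypothetical job queue
    jobDefinition="my-transform-jobdef",  # hypothetical job definition
    containerOverrides={
        "environment": [
            {"name": "INPUT_PREFIX", "value": "s3://my-export-bucket/dynamodb-export/"},
            {"name": "OUTPUT_PREFIX", "value": "s3://my-export-bucket/transformed/"},
        ]
    },
)
print("Batch job submitted:", job["jobId"])
```

Note that the native export runs asynchronously, so in practice you would poll describe_export (or orchestrate the wait with Step Functions, as mentioned above) before submitting the Batch job.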

If this has resolved your issue or was helpful, accepting the answer would be greatly appreciated. Thank you!

Expert
Answered 4 months ago

Hi,

Yes, AWS Glue is the ETL service on AWS for such tasks: it lets you process and transform the data as you export it from DynamoDB to S3.

Here is a good article detailing how to do it: https://dev.to/ritaly/how-to-export-aws-dynamodb-data-to-s3-for-recurring-tasks-4l47
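In case it helps, below is a minimal sketch of what such a Glue (PySpark) job could look like, using the Glue DynamoDB export connector; the table ARN, bucket names, and the transform function are placeholder assumptions:

```python
import sys
from awsglue.transforms import Map
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext
from awsglue.context import GlueContext
from awsglue.job import Job

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glueContext = GlueContext(SparkContext())
job = Job(glueContext)
job.init(args["JOB_NAME"], args)

# Read from DynamoDB via the Glue export connector, which triggers a
# native export to a staging S3 location instead of scanning the table.
dyf = glueContext.create_dynamic_frame.from_options(
    connection_type="dynamodb",
    connection_options={
        "dynamodb.export": "ddb",
        "dynamodb.tableArn": "arn:aws:dynamodb:us-east-1:123456789012:table/MyLargeTable",
        "dynamodb.s3.bucket": "my-export-staging-bucket",
        "dynamodb.s3.prefix": "glue-ddb-export/",
        "dynamodb.unnestDDBJson": True,
    },
)

# Placeholder per-record transformation; plug your own logic in here.
def transform(record):
    record["processed"] = True
    return record

transformed = Map.apply(frame=dyf, f=transform)

# Write the transformed data to the destination S3 bucket as Parquet.
glueContext.write_dynamic_frame.from_options(
    frame=transformed,
    connection_type="s3",
    connection_options={"path": "s3://my-output-bucket/transformed/"},
    format="parquet",
)
job.commit()
```

Since your transformation calls into a Java package, you would likely write the Glue job in Scala/Java rather than Python, but the overall structure (read DynamoDB export, transform, write to S3) stays the same.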

Best,

Didier

AWS
Expert
Answered 4 months ago
