Need to export a DynamoDB table to CSV



I want to automate the CSV export so my team can access the file, which I would store in an S3 bucket. I can find many solutions for importing CSV files into DynamoDB, but none of them answer how to export to CSV. I know there is an "Export to CSV" button in the DynamoDB console, and that is exactly what I want, but automated. I've been trying with Lambda, Glue, Exports, and Streams, but none of them seem to work, or I get stuck because I can't find any good documentation. Can somebody please help me out?


1 Answer

One way to do it programmatically is to run a Scan operation and format the data as CSV:

  aws dynamodb scan \
  --table-name <table_name> \
  --select ALL_ATTRIBUTES \
  --page-size 500 \
  --max-items 10000 \
  --output json | jq -r '.Items' | jq -r '(.[0] | keys_unsorted) as $keys | $keys, map([.[ $keys[] ][]?])[] | @csv' > my-table-3.csv
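The same Scan-and-format idea can be sketched in Python with boto3's Scan paginator, which handles pagination for you. This is a sketch, not a drop-in solution: `my-table` is a placeholder table name, AWS credentials are assumed to be configured, and the column order is taken from the first item, mirroring the jq one-liner above.

```python
import csv
import io

def unwrap(attr):
    """Unwrap a low-level DynamoDB attribute, e.g. {"S": "foo"} -> "foo"."""
    ((_, value),) = attr.items()
    return value

def items_to_csv(items):
    """Render scanned items (low-level DynamoDB JSON) as a CSV string.
    Columns come from the first item's keys; missing attributes are blank."""
    if not items:
        return ""
    keys = list(items[0].keys())
    buf = io.StringIO()
    writer = csv.writer(buf)
    writer.writerow(keys)
    for item in items:
        writer.writerow([unwrap(item[k]) if k in item else "" for k in keys])
    return buf.getvalue()

if __name__ == "__main__":
    import boto3  # assumption: credentials and region are configured
    paginator = boto3.client("dynamodb").get_paginator("scan")
    items = []
    for page in paginator.paginate(TableName="my-table"):  # placeholder name
        items.extend(page["Items"])
    with open("my-table.csv", "w", newline="") as f:
        f.write(items_to_csv(items))
```

Note that, like the CLI version, a Scan reads the whole table and consumes read capacity accordingly, so this is best run off-peak or against an on-demand table.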

Another way is to use AWS Data Pipeline to export the table to CSV and store it directly in S3.

Finally, you can use AWS Glue to read the DynamoDB table and write it to S3 in CSV format.

There are also third-party options available, many of them paid, such as Dynobase.
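Since the question asks for automation, the Scan approach can be wrapped in a Lambda function triggered on a schedule (for example an EventBridge cron rule) that writes the CSV to S3. This is a hedged sketch: the `TABLE_NAME` and `EXPORT_BUCKET` environment variables, the `exports/...` key layout, and the `export_key` helper are all hypothetical names you would choose yourself.

```python
import csv
import io
import os
from datetime import datetime, timezone

def export_key(table_name, now=None):
    """Build a date-stamped S3 key, e.g. 'exports/my-table/2024-01-31.csv'.
    (Hypothetical layout; pick whatever convention suits your team.)"""
    now = now or datetime.now(timezone.utc)
    return f"exports/{table_name}/{now:%Y-%m-%d}.csv"

def lambda_handler(event, context):
    import boto3  # available in the Lambda Python runtime
    table_name = os.environ["TABLE_NAME"]   # set in the function configuration
    bucket = os.environ["EXPORT_BUCKET"]    # set in the function configuration
    dynamodb = boto3.client("dynamodb")
    buf = io.StringIO()
    writer = None
    for page in dynamodb.get_paginator("scan").paginate(TableName=table_name):
        for item in page["Items"]:
            # Unwrap low-level attributes like {"S": "foo"} to plain values.
            plain = {k: next(iter(v.values())) for k, v in item.items()}
            if writer is None:
                # Columns from the first item; ignore/blank any mismatches.
                writer = csv.DictWriter(buf, fieldnames=list(plain),
                                        restval="", extrasaction="ignore")
                writer.writeheader()
            writer.writerow(plain)
    key = export_key(table_name)
    boto3.client("s3").put_object(Bucket=bucket, Key=key, Body=buf.getvalue())
    return {"bucket": bucket, "key": key}
```

The Lambda's execution role would need `dynamodb:Scan` on the table and `s3:PutObject` on the bucket; for very large tables, keep in mind the 15-minute Lambda timeout and consider Glue or Data Pipeline instead.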

answered 11 days ago
