
Continuous DynamoDB backup in Backup Vault by using incremental export to S3 and continuous S3 backup to Vault - will that work?


Backup Vault doesn't currently support PITR (continuous) backups for DynamoDB. At the same time, DynamoDB supports incremental export to S3, and Backup Vault supports continuous backup of S3 buckets.

Can I combine the two to achieve continuous DynamoDB backup in Backup Vault?

  1. Run incremental export from DynamoDB to an S3 bucket
  2. Run continuous backup of that S3 bucket to Backup Vault

I understand this will not give me PITR from Vault - this is mostly a data loss prevention measure / compliance requirement.

asked a year ago · 276 views
1 Answer

Greeting

Hi Maciej!

Thank you for your detailed question about achieving a reliable backup strategy for DynamoDB using S3 and AWS Backup. You’ve laid out a thoughtful approach to ensure data protection and compliance despite the absence of direct PITR support in Backup Vault. Let’s explore how you can effectively combine these components for a robust solution.


Clarifying the Issue

You want continuous DynamoDB protection in a Backup Vault by chaining two supported features: incremental exports from DynamoDB to S3, and continuous backups of that S3 bucket to AWS Backup. As you note, this won’t give you PITR from the vault itself, but it does address your compliance and data loss prevention requirements. Combining DynamoDB’s native PITR feature with this pipeline covers both short-term recovery and long-term retention. Let’s craft a practical and efficient solution tailored to these goals. 🚀


Why This Matters

Reliable backups are essential for protecting critical data, ensuring compliance, and enabling disaster recovery. PITR offers granular, rapid recovery for operational errors, while S3 and Backup Vault provide long-term durability and retention. For example, in a compliance audit requiring historical data or a disaster recovery scenario due to accidental deletions, this strategy ensures resilience and keeps your systems operational with minimal data loss.


Key Terms

  • Point-in-Time Recovery (PITR): A DynamoDB feature enabling recovery of table data to any second within the last 35 days.
  • Incremental Export to S3: A DynamoDB feature exporting table data to an S3 bucket for long-term retention or compliance.
  • AWS Backup Vault: A secure storage resource in AWS Backup for managing snapshots of supported services, including S3 buckets.
  • S3 Lifecycle Policies: A feature to manage the cost and duration of data storage in S3 by transitioning objects between storage classes.

The Solution (Our Recipe)

Steps at a Glance:

  1. Enable Point-in-Time Recovery (PITR) on your DynamoDB table.
  2. Schedule incremental exports from DynamoDB to an S3 bucket.
  3. Configure continuous S3 backups to AWS Backup Vault.
  4. Monitor and validate backups to ensure reliability.

Step-by-Step Guide:

  1. Enable PITR for the DynamoDB Table:
    • PITR ensures you can recover the table to any second within the last 35 days.
    • Enable PITR with the update-continuous-backups command (PITR is managed through DynamoDB’s continuous backups settings):
      aws dynamodb update-continuous-backups \
        --table-name MyDynamoDBTable \
        --point-in-time-recovery-specification PointInTimeRecoveryEnabled=true
    • To restore to a specific time:
      aws dynamodb restore-table-to-point-in-time \
        --source-table-name MyDynamoDBTable \
        --target-table-name RestoredDynamoDBTable \
        --restore-date-time 2025-01-16T12:00:00Z
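    A quick sanity check after step 1: `describe-continuous-backups` reports whether PITR is active and the earliest restorable time. This is a live AWS CLI call that needs credentials, so treat it as a sketch using the example table name from above:

```shell
# Check PITR status for the example table; prints "ENABLED" once PITR is on.
aws dynamodb describe-continuous-backups \
  --table-name MyDynamoDBTable \
  --query 'ContinuousBackupsDescription.PointInTimeRecoveryDescription.PointInTimeRecoveryStatus'
```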

  2. Schedule Incremental Exports to S3:
    • Automate exports to an S3 bucket to ensure data retention beyond 35 days.
    • Example incremental export command (the command takes the table ARN rather than the table name, and an incremental export specifies the time window to capture):
      aws dynamodb export-table-to-point-in-time \
        --table-arn arn:aws:dynamodb:us-east-1:123456789012:table/MyDynamoDBTable \
        --s3-bucket my-s3-backup-bucket \
        --export-type INCREMENTAL \
        --incremental-export-specification ExportFromTime=2025-01-15T12:00:00Z,ExportToTime=2025-01-16T12:00:00Z
    • Use S3 lifecycle policies to reduce costs for older data:
      • Transition objects to Glacier or Deep Archive for long-term storage.
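    As a sketch of that lifecycle tiering (the prefix and 90-day threshold are assumptions, not values from the question; DynamoDB exports land under an AWSDynamoDB/ prefix in the bucket):

```shell
# Hypothetical lifecycle rule: after 90 days, transition export objects
# to S3 Glacier Deep Archive for cheaper long-term retention.
aws s3api put-bucket-lifecycle-configuration \
  --bucket my-s3-backup-bucket \
  --lifecycle-configuration '{
    "Rules": [{
      "ID": "ArchiveOldExports",
      "Status": "Enabled",
      "Filter": { "Prefix": "AWSDynamoDB/" },
      "Transitions": [{ "Days": 90, "StorageClass": "DEEP_ARCHIVE" }]
    }]
  }'
```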

  3. Set Up Continuous Backups of the S3 Bucket:
    • Use AWS Backup to continuously back up the S3 bucket and store snapshots in Backup Vault.
    • Create a backup plan. For continuous (point-in-time) S3 backups, the rule must set "EnableContinuousBackup": true, retention ("DeleteAfterDays") can be at most 35 days, and the bucket must have S3 Versioning enabled:
      aws backup create-backup-plan --backup-plan '{
        "BackupPlanName": "S3BackupPlan",
        "Rules": [{
          "RuleName": "ContinuousS3Backup",
          "TargetBackupVaultName": "MyBackupVault",
          "ScheduleExpression": "cron(0 0 * * ? *)",
          "StartWindowMinutes": 60,
          "CompletionWindowMinutes": 180,
          "EnableContinuousBackup": true,
          "Lifecycle": { "DeleteAfterDays": 35 }
        }]
      }'
    • Assign the S3 bucket to the backup plan:
      aws backup create-backup-selection \
        --backup-plan-id <BackupPlanID> \
        --backup-selection '{
          "SelectionName": "S3BucketBackupSelection",
          "IamRoleArn": "arn:aws:iam::123456789012:role/AWSBackupRole",
          "Resources": ["arn:aws:s3:::my-s3-backup-bucket"]
        }'

  4. Monitor and Validate Backups:
    • Regularly test PITR restores for quick recovery needs.
    • Validate S3 exports for data integrity using checksums.
    • Monitor AWS Backup Vault to confirm snapshots are created:
      aws backup list-recovery-points-by-backup-vault \
        --backup-vault-name MyBackupVault
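    For the integrity check in step 4, each DynamoDB export writes a manifest (manifest-files.json) that records an MD5 checksum per data file. A minimal local verification sketch — the file path and checksum below are illustrative stand-ins, not real export output:

```shell
# Verify a downloaded export data file against the md5Checksum recorded
# in the export manifest. Values here are stand-ins for demonstration.
expected="5d41402abc4b2a76b9719d911017c592"   # md5 of the word "hello"
printf 'hello' > /tmp/data-file.json.gz       # stand-in for a downloaded data file
actual=$(md5sum /tmp/data-file.json.gz | awk '{print $1}')
if [ "$actual" = "$expected" ]; then
  echo "checksum OK"
else
  echo "checksum MISMATCH" >&2
fi
```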

Closing Thoughts

This strategy combines DynamoDB’s PITR with S3 exports and Backup Vault snapshots, creating a robust solution that balances short-term recovery with long-term retention. For example, if an audit requires data from 6 months ago, your S3 exports and Backup Vault snapshots provide compliance-friendly storage, while PITR ensures you can handle day-to-day recovery needs effortlessly.


Farewell

I hope this enhanced guide gives you clarity, Maciej, and meets your compliance and data resilience goals. Let me know if you have any follow-up questions or need help fine-tuning this further! 🚀 😊


Cheers,

Aaron 😊

answered a year ago
