How can I migrate my DynamoDB tables from one AWS Account to another?
I want to copy the data in my Amazon DynamoDB table to a new table in another AWS account, in the same or a different AWS Region.
Short Description
There are several methods that you can use to transfer data from a DynamoDB table in one AWS account to another. The best choice depends on several factors, including data volume, the need for real-time updates, and the complexity of data transformations.
To migrate DynamoDB tables from one AWS account to another, choose one of the following options:
- AWS Backup
- DynamoDB export to Amazon Simple Storage Service (Amazon S3) and import
- Amazon EMR
- AWS Data Pipeline
- Amazon S3 and AWS Glue
- Custom export and import script
Resolution
Based on your use case, complete the following tasks.
AWS Backup
For source and destination AWS accounts in the same AWS Organizations organization, AWS Backup can perform cross-Region and cross-account DynamoDB data transfers. For more information, see Creating backup copies across AWS accounts.
In the target account, complete the following steps:
- Create an AWS Backup vault in the target account in the AWS Region where your DynamoDB table is present. When you create the vault, use the AWS Key Management Service (AWS KMS) key that you configured. This is the key that was shared with the source account in the same organization.
- Add an AWS Identity and Access Management (IAM) policy to the vault that allows you to back up to the vault. To do this, select the option Allow access to a backup vault from organization. This allows other accounts within the same organization to copy into the vault.
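If you prefer to script this target-account setup instead of using the console, the following is a minimal boto3 sketch. The vault name, KMS key ARN, and organization ID are placeholders, and the vault policy mirrors the Allow access to a backup vault from organization option.

```python
import json
import boto3

# Placeholder values for illustration; replace them with your own.
REGION = "us-east-1"
VAULT_NAME = "dynamodb-migration-vault"
KMS_KEY_ARN = "arn:aws:kms:us-east-1:111122223333:key/EXAMPLE-KEY-ID"
ORG_ID = "o-exampleorgid"

backup = boto3.client("backup", region_name=REGION)

# Create the vault in the target account with the shared KMS key.
backup.create_backup_vault(
    BackupVaultName=VAULT_NAME,
    EncryptionKeyArn=KMS_KEY_ARN,
)

# Allow other accounts in the same organization to copy recovery points into the vault.
policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Principal": "*",
            "Action": "backup:CopyIntoBackupVault",
            "Resource": "*",
            "Condition": {"StringEquals": {"aws:PrincipalOrgID": ORG_ID}},
        }
    ],
}
backup.put_backup_vault_access_policy(
    BackupVaultName=VAULT_NAME,
    Policy=json.dumps(policy),
)
```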
In the source account, complete the following tasks:
- Create an AWS Backup vault in the Region where you need to migrate your table data to. When you create the vault, use the AWS KMS key that you already configured. This is the key that was shared with other accounts in the organization.
- Add an IAM policy to the vault that allows other accounts in the organization to copy into the vault. To do this, select the option Allow access to a backup vault from organization.
- Create a backup plan that generates backups of the DynamoDB tables in the source account and copies them to the target account. For Backup vault, choose the vault that you created in the source account.
- Select the Copy to another account's vault option.
- For Assign resources, choose the Include specific resource types option to include the resources that you must back up.
- For Select specific resource types, select DynamoDB. Then, choose all tables or only those tables that you must back up.
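The following boto3 sketch outlines the same source-account configuration: a backup plan whose rule copies each recovery point to the target account's vault, plus a resource selection for the DynamoDB table. The account IDs, ARNs, vault names, and schedule are placeholders.

```python
import boto3

backup = boto3.client("backup", region_name="us-east-1")

SOURCE_VAULT = "dynamodb-migration-vault"  # vault created in the source account
TARGET_VAULT_ARN = (
    "arn:aws:backup:us-east-1:444455556666:backup-vault:dynamodb-migration-vault"
)  # vault created in the target account
TABLE_ARN = "arn:aws:dynamodb:us-east-1:111122223333:table/MyTable"
BACKUP_ROLE_ARN = "arn:aws:iam::111122223333:role/service-role/AWSBackupDefaultServiceRole"

# Backup plan: a daily rule that also copies each recovery point to the target account's vault.
plan = backup.create_backup_plan(
    BackupPlan={
        "BackupPlanName": "dynamodb-cross-account-migration",
        "Rules": [
            {
                "RuleName": "daily-with-cross-account-copy",
                "TargetBackupVaultName": SOURCE_VAULT,
                "ScheduleExpression": "cron(0 5 * * ? *)",
                "CopyActions": [{"DestinationBackupVaultArn": TARGET_VAULT_ARN}],
            }
        ],
    }
)

# Assign the DynamoDB table (or tables) that you want to back up.
backup.create_backup_selection(
    BackupPlanId=plan["BackupPlanId"],
    BackupSelection={
        "SelectionName": "dynamodb-tables",
        "IamRoleArn": BACKUP_ROLE_ARN,
        "Resources": [TABLE_ARN],
    },
)
```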
To review your configuration and restore a table, complete the following steps:
- In the target account, navigate to the vault that you created. You can see that the Recovery points are the same as in the source account.
- Restore your DynamoDB table in the target account.
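To script the restore in the target account, a minimal boto3 sketch follows. The vault name and IAM role ARN are placeholders, and the targetTableName metadata key is an assumption; check the metadata that get_recovery_point_restore_metadata returns for your recovery point.

```python
import boto3

backup = boto3.client("backup", region_name="us-east-1")

VAULT_NAME = "dynamodb-migration-vault"  # vault in the target account
RESTORE_ROLE_ARN = "arn:aws:iam::444455556666:role/service-role/AWSBackupDefaultServiceRole"

# Pick a copied recovery point; filter by ResourceArn or creation date as needed.
points = backup.list_recovery_points_by_backup_vault(BackupVaultName=VAULT_NAME)
recovery_point_arn = points["RecoveryPoints"][0]["RecoveryPointArn"]

# Start from the metadata stored on the recovery point, then set the new table name.
metadata = backup.get_recovery_point_restore_metadata(
    BackupVaultName=VAULT_NAME,
    RecoveryPointArn=recovery_point_arn,
)["RestoreMetadata"]
metadata["targetTableName"] = "MyTable-restored"  # assumed metadata key; verify for your case

backup.start_restore_job(
    RecoveryPointArn=recovery_point_arn,
    Metadata=metadata,
    IamRoleArn=RESTORE_ROLE_ARN,
    ResourceType="DynamoDB",
)
```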
Note: This option is supported only for accounts that are part of the same organization in AWS Organizations.
DynamoDB export to Amazon S3 and import
Use the DynamoDB export to Amazon S3 feature to export data from an Amazon DynamoDB table at any point within your point-in-time recovery window. For an example of how to use this feature, see Export Amazon DynamoDB table data to your data lake in Amazon S3, no code writing required.
To use the DynamoDB export to Amazon S3 feature, complete the following steps:
- To migrate the DynamoDB table data, export the table to an Amazon S3 bucket in the target account.
Note: DynamoDB must have s3:ListBucket permissions to this S3 bucket. The S3 bucket can't have any access control lists that deny access to the exported data.
- Import the data from the S3 bucket into a new table in the target account.
Note: This option requires you to set up cross-account Amazon S3 permissions and IAM roles.
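As a rough illustration of these two steps, the following boto3 sketch exports the source table to a bucket owned by the target account and then, running with the target account's credentials, imports the exported data into a new table. The bucket name, account ID, table name, and key schema are placeholders, and the import prefix must point to the data files that the export produced.

```python
import boto3

SOURCE_TABLE_ARN = "arn:aws:dynamodb:us-east-1:111122223333:table/MyTable"
TARGET_BUCKET = "target-account-export-bucket"  # S3 bucket owned by the target account
TARGET_ACCOUNT_ID = "444455556666"

# In the source account: export the table (point-in-time recovery must be enabled).
dynamodb_source = boto3.client("dynamodb", region_name="us-east-1")
export = dynamodb_source.export_table_to_point_in_time(
    TableArn=SOURCE_TABLE_ARN,
    S3Bucket=TARGET_BUCKET,
    S3BucketOwner=TARGET_ACCOUNT_ID,
    S3Prefix="exports/MyTable",
    ExportFormat="DYNAMODB_JSON",
)
print(export["ExportDescription"]["ExportArn"])

# In the target account (run with the target account's credentials): import into a new table.
dynamodb_target = boto3.client("dynamodb", region_name="us-east-1")
dynamodb_target.import_table(
    S3BucketSource={
        "S3Bucket": TARGET_BUCKET,
        # Point this prefix at the data files that the export produced
        # (exports/MyTable/AWSDynamoDB/<export-id>/data/).
        "S3KeyPrefix": "exports/MyTable/AWSDynamoDB/EXAMPLE-EXPORT-ID/data/",
    },
    InputFormat="DYNAMODB_JSON",
    InputCompressionType="GZIP",
    TableCreationParameters={
        "TableName": "MyTable",
        "AttributeDefinitions": [{"AttributeName": "pk", "AttributeType": "S"}],
        "KeySchema": [{"AttributeName": "pk", "KeyType": "HASH"}],
        "BillingMode": "PAY_PER_REQUEST",
    },
)
```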
Amazon EMR
To use Amazon EMR to export your data to an S3 bucket, use one of the following methods:
- Use DynamoDBStorageHandler to run Hive or Spark queries against DynamoDB tables. For more information, see Exporting data from DynamoDB.
- To export or import DynamoDB tables, use the open-source emr-dynamodb-tool on GitHub.
To use Amazon EMR to migrate a DynamoDB table, complete the following steps:
- Launch EMR clusters in both the source and destination accounts. In the Software configuration section, choose an option that includes Apache Hive.
Note: It's a best practice to launch Amazon EMR clusters into private subnets. The private subnets must have an Amazon S3 VPC endpoint and a route to DynamoDB. If the clusters must access the internet, then use a NAT gateway that resides in a public subnet. For more information, see VPC with servers in private subnets and NAT.
- Update the EMR_EC2_DefaultRole IAM roles in both accounts to have permission to write to the S3 bucket in the destination account. For more information, see Configure IAM service roles for Amazon EMR permissions to AWS services and resources.
- In the source account, use SSH to connect to the leader node.
- In the source account, use Hive commands to export the DynamoDB table data to the S3 bucket in the destination account (see the sketch after these steps).
- In the destination account, import the Amazon S3 data to the new DynamoDB table.
Note: If you use a staging table to capture writes that happen during the migration, then repeat steps 3 and 4 on the staging table.
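If you prefer to submit the Hive export as an EMR step rather than run it interactively over SSH, the following is a minimal boto3 sketch. The cluster ID, bucket names, table name, and column mapping are placeholders; the import on the destination cluster follows the same pattern with a Hive table that reads from the S3 location and writes to the new DynamoDB table.

```python
import boto3

REGION = "us-east-1"
SOURCE_CLUSTER_ID = "j-EXAMPLECLUSTER"        # EMR cluster in the source account
SCRIPT_BUCKET = "source-account-scripts"      # bucket that holds the Hive script
DEST_BUCKET = "target-account-export-bucket"  # bucket in the destination account

# HiveQL that maps the DynamoDB table and writes its contents to the destination bucket.
# The columns and the dynamodb.column.mapping value are placeholders for your table's schema.
hive_script = f"""
CREATE EXTERNAL TABLE ddb_source (pk string, payload string)
STORED BY 'org.apache.hadoop.hive.dynamodb.DynamoDBStorageHandler'
TBLPROPERTIES (
  "dynamodb.table.name" = "MyTable",
  "dynamodb.column.mapping" = "pk:pk,payload:payload"
);

INSERT OVERWRITE DIRECTORY 's3://{DEST_BUCKET}/hive-export/'
SELECT * FROM ddb_source;
"""

# Upload the script, then submit it as a Hive step on the source cluster.
s3 = boto3.client("s3", region_name=REGION)
s3.put_object(Bucket=SCRIPT_BUCKET, Key="export-dynamodb.q", Body=hive_script)

emr = boto3.client("emr", region_name=REGION)
emr.add_job_flow_steps(
    JobFlowId=SOURCE_CLUSTER_ID,
    Steps=[
        {
            "Name": "export-dynamodb-to-s3",
            "ActionOnFailure": "CONTINUE",
            "HadoopJarStep": {
                "Jar": "command-runner.jar",
                "Args": [
                    "hive-script", "--run-hive-script", "--args",
                    "-f", f"s3://{SCRIPT_BUCKET}/export-dynamodb.q",
                ],
            },
        }
    ],
)
```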
To reduce downtime, you can store all transactions that occur during the migration in a staging table. After the source table is migrated to the target account, push the new transactions from the staging table to the target table.
The time required to migrate tables with Amazon EMR can vary. The time depends on the DynamoDB table's provisioned throughput, network performance, and the amount of data stored in the table.
Note: This option requires you to create and maintain an EMR cluster.
AWS Data Pipeline
Use AWS Data Pipeline to export data from a DynamoDB table to a file in an Amazon S3 bucket. This option runs managed Hadoop clusters in Amazon EMR to perform reads and writes between DynamoDB and Amazon S3.
Note: This option requires you to set up and manage AWS Data Pipeline, cross-account permissions, and IAM roles.
Amazon S3 and AWS Glue
AWS Glue ETL jobs support reading data from another account's DynamoDB table and writing data into another account's DynamoDB table. Use the dynamodb.sts.roleArn parameter to assume a cross-account role in the job script. When you assume the role, you get temporary credentials that must be used for cross-account access to DynamoDB. For more information, see Cross-account cross-Region access to DynamoDB tables and How to export an Amazon DynamoDB table to Amazon S3 using AWS Step Functions and AWS Glue.
Note: This option requires extensive use of Spark and for you to maintain the source code for your AWS Glue ETL job. For more information, see DynamoDB connections.
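The following PySpark sketch shows the shape of such an AWS Glue ETL job, assuming a hypothetical cross-account role ARN and placeholder table names: it reads the source table through dynamodb.sts.roleArn and writes the items to the destination table.

```python
import sys
from awsglue.context import GlueContext
from awsglue.job import Job
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext.getOrCreate())
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Read the source table in the other account by assuming a cross-account role (placeholder ARN).
source = glue_context.create_dynamic_frame.from_options(
    connection_type="dynamodb",
    connection_options={
        "dynamodb.input.tableName": "MyTable",
        "dynamodb.sts.roleArn": "arn:aws:iam::111122223333:role/GlueDynamoDBReadRole",
        "dynamodb.throughput.read.percent": "0.5",
    },
)

# Write the items into the destination table in the account that runs the job.
# To write cross-account instead, add dynamodb.sts.roleArn here as well.
glue_context.write_dynamic_frame_from_options(
    frame=source,
    connection_type="dynamodb",
    connection_options={
        "dynamodb.output.tableName": "MyTable",
        "dynamodb.throughput.write.percent": "0.5",
    },
)

job.commit()
```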
Custom export and import script
For smaller datasets (around 2 GB or less), or for one-time transfers, you can use a manual export and import process. For example, you can use C# code that uses DynamoDB Scan operations to read items from the source table. Then, the code uses BatchWriteItem API calls to write the data to a destination table in the target account.
Note: This process can be time consuming for large datasets and requires custom scripting. You must also configure cross-account access for the IAM entity that runs the script.
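The article mentions C#; the same Scan and BatchWriteItem pattern looks like the following in Python with boto3, assuming a hypothetical role in the target account that grants write access to the destination table.

```python
import boto3

# Placeholder names and ARN for illustration.
SOURCE_TABLE = "MyTable"
TARGET_TABLE = "MyTable"
TARGET_ROLE_ARN = "arn:aws:iam::444455556666:role/DynamoDBWriteRole"

# Read from the source table with the current (source-account) credentials.
source_table = boto3.resource("dynamodb").Table(SOURCE_TABLE)

# Assume a role in the target account to get credentials that can write to the destination table.
creds = boto3.client("sts").assume_role(
    RoleArn=TARGET_ROLE_ARN, RoleSessionName="ddb-migration"
)["Credentials"]
target_table = boto3.resource(
    "dynamodb",
    aws_access_key_id=creds["AccessKeyId"],
    aws_secret_access_key=creds["SecretAccessKey"],
    aws_session_token=creds["SessionToken"],
).Table(TARGET_TABLE)

# Scan the source table page by page and batch-write each page to the destination.
scan_kwargs = {}
with target_table.batch_writer() as writer:
    while True:
        page = source_table.scan(**scan_kwargs)
        for item in page["Items"]:
            writer.put_item(Item=item)
        if "LastEvaluatedKey" not in page:
            break
        scan_kwargs["ExclusiveStartKey"] = page["LastEvaluatedKey"]
```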