Replica of a 20 TB volume


I need to implement automation for the following scenario: copy a snapshot of an instance from another account to my account, then create a volume from that snapshot using KMS keys, then perform a few activities on the volume. The solution also needs to support volumes of up to 20 TB.

Could you please suggest the best solution for automating the above scenario?

Asked a month ago · 119 views
1 Answer
Accepted Answer

I recommend starting by copying the snapshot from the other account to yours. Next, create a volume from the copied snapshot using your preferred KMS key. Finally, you can carry out various operations on the volume, such as attaching it to an EC2 instance, formatting it, and mounting it. A single 20 TB volume is possible with the Provisioned IOPS SSD io2 Block Express volume type, which supports sizes up to 64 TiB (gp3 and io1 top out at 16 TiB).

Here's a streamlined overview of the process:

# Copy the shared snapshot from the other account into your account
# (run in the destination region; the copy is encrypted with your KMS key)
aws ec2 copy-snapshot --region <destination-region> --source-region <source-region> --source-snapshot-id <source-snapshot-id> --encrypted --kms-key-id <kms-key-id>

# Create a volume from the copied snapshot with your KMS key
# (a 20,000 GiB volume requires io2 Block Express; gp3 and io1 max out at 16,384 GiB)
aws ec2 create-volume --snapshot-id <copied-snapshot-id> --volume-type io2 --iops 3000 --size 20000 --kms-key-id <kms-key-id> --availability-zone <availability-zone>

# Attach the volume to an EC2 instance in the same Availability Zone
aws ec2 attach-volume --volume-id <volume-id> --instance-id <instance-id> --device /dev/sdf

# Format and mount the volume (on Nitro instances the device appears as an NVMe device such as /dev/nvme1n1)
sudo mkfs -t ext4 /dev/sdf
sudo mount /dev/sdf /mnt

ℹ️ You would need to replace <source-region>, <source-snapshot-id>, <kms-key-id>, <destination-region>, <copied-snapshot-id>, <availability-zone>, <volume-id>, and <instance-id> with your specific values.
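
After the volume is formatted and mounted, you can sanity-check the result; the device name and mount point below simply match the example above:

# Verify the attached device, filesystem type, and available space
lsblk
df -hT /mnt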


⚠️ Ensure you have an IAM role with a cross-account policy, that the other account has shared the snapshot with your account, and, for an encrypted snapshot, that you have been granted access to the KMS key it was encrypted with.
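
For reference, here is a minimal sketch of what the source account would need to run to share the snapshot and its key with your account. The placeholders <destination-account-id> and <source-kms-key-arn> are illustrative, the exact set of KMS operations may need adjusting to your setup, and note that a snapshot encrypted with the default aws/ebs key cannot be shared, so it must use a customer managed key.

# In the source account: share the snapshot with the destination account
aws ec2 modify-snapshot-attribute --snapshot-id <source-snapshot-id> --attribute createVolumePermission --operation-type add --user-ids <destination-account-id>

# In the source account: allow the destination account to use the KMS key for the copy
aws kms create-grant --key-id <source-kms-key-arn> --grantee-principal arn:aws:iam::<destination-account-id>:root --operations Decrypt DescribeKey CreateGrant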

💡 Additionally, you can consider using AWS Lambda to orchestrate the entire process, which would provide more flexibility and maintainability in the long run.
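
If you would rather drive the flow from a single script than from Lambda, here is a minimal shell sketch that chains the steps with AWS CLI waiters so each step starts only after the previous one finishes. It reuses the placeholders above, the variable names are illustrative, and a 20 TB copy can take long enough that the default waiter timeouts may be too short, in which case you would retry the wait or poll describe-snapshots from a scheduler instead.

#!/usr/bin/env bash
set -euo pipefail

# Copy the shared snapshot and capture the new snapshot ID
COPIED_SNAPSHOT_ID=$(aws ec2 copy-snapshot --region <destination-region> --source-region <source-region> --source-snapshot-id <source-snapshot-id> --encrypted --kms-key-id <kms-key-id> --query SnapshotId --output text)

# Wait until the copy has finished before creating the volume
# (may need to be retried for very large snapshots)
aws ec2 wait snapshot-completed --snapshot-ids "$COPIED_SNAPSHOT_ID"

# Create the encrypted volume from the copied snapshot
VOLUME_ID=$(aws ec2 create-volume --snapshot-id "$COPIED_SNAPSHOT_ID" --volume-type io2 --iops 3000 --size 20000 --kms-key-id <kms-key-id> --availability-zone <availability-zone> --query VolumeId --output text)

# Wait for the volume, then attach it to the target instance
aws ec2 wait volume-available --volume-ids "$VOLUME_ID"
aws ec2 attach-volume --volume-id "$VOLUME_ID" --instance-id <instance-id> --device /dev/sdf
aws ec2 wait volume-in-use --volume-ids "$VOLUME_ID"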

EXPERT · answered a month ago
EXPERT · reviewed a month ago
AWS EXPERT · reviewed a month ago
  • Thanks for the response. I have already implemented a Step Functions workflow with Lambda, but Lambda has an execution-time limit of 900 seconds. I am now splitting the task across multiple Lambdas and can handle a maximum of about 80 GB. Could you please suggest how my solution can support up to 20 TB?

  • Alright, you're correct about the Lambda execution time limitation. In this case, I'd suggest using AWS Data Pipeline. With Data Pipeline, you can define workflow activities similar to what I described earlier, such as copying snapshots, creating volumes, attaching volumes, and formatting and mounting them.

    ℹ️ Data Pipeline activities are not subject to the 900-second Lambda limit, so long-running steps such as copying a 20 TB snapshot can run to completion. You also only pay for the resources the pipeline uses, making it a cost-effective option for your needs.
