Replicating a volume of up to 20 TB


I need to automate the following scenario: copy an instance-based snapshot from another account to my account, create a volume from that snapshot using KMS keys, and then perform a few activities on the volume. The solution must support volumes of up to 20 TB.

Could you please suggest the best solution for automating the above scenario?

Asked a month ago · 118 views
1 Answer

Accepted Answer

I recommend starting by copying the snapshot from the other account to yours. Next, create a volume from the copied snapshot using your preferred KMS key. Finally, you can carry out various operations on the volume, such as attaching it to an EC2 instance, formatting it, and mounting it. For a 20 TB volume, use the Provisioned IOPS SSD io2 volume type: io2 Block Express supports sizes up to 64 TiB, whereas io1 (and gp3) top out at 16 TiB.

Here's a streamlined overview of the process:

# Copy Snapshot from Other Account (run this in the destination region)
aws ec2 copy-snapshot --region <destination-region> --source-region <source-region> --source-snapshot-id <source-snapshot-id> --kms-key-id <kms-key-id> --encrypted

# Create Volume from Copied Snapshot with KMS Key (io2 is required for sizes above 16 TiB)
aws ec2 create-volume --snapshot-id <copied-snapshot-id> --volume-type io2 --iops 3000 --size 20000 --kms-key-id <kms-key-id> --availability-zone <availability-zone>

# Attach Volume to EC2 Instance
aws ec2 attach-volume --volume-id <volume-id> --instance-id <instance-id> --device /dev/sdf

# Format and Mount Volume (run on the instance; on Nitro instances the device appears as /dev/nvme*)
mkfs -t ext4 /dev/sdf
mount /dev/sdf /mnt

ℹ️ You would need to replace <source-region>, <source-snapshot-id>, <kms-key-id>, <destination-region>, <copied-snapshot-id>, <availability-zone>, <volume-id>, and <instance-id> with your specific values.
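Because the snapshot copy is asynchronous, an automated workflow should wait for it to complete before creating the volume. Here is a minimal sketch of the first two steps as one script; the function name and the 20480 GiB size are illustrative, and the placeholder values are the same ones listed above:

```shell
#!/usr/bin/env bash
set -euo pipefail

# Sketch: copy the shared snapshot, wait for the copy to finish, then create
# the volume. All arguments are placeholders you must supply.
copy_and_create_volume() {
  local src_region="$1" dst_region="$2" src_snapshot_id="$3" kms_key_id="$4" az="$5"

  # Start the cross-account copy and capture the new snapshot ID from the response.
  local new_snapshot_id
  new_snapshot_id=$(aws ec2 copy-snapshot \
      --region "$dst_region" \
      --source-region "$src_region" \
      --source-snapshot-id "$src_snapshot_id" \
      --kms-key-id "$kms_key_id" \
      --encrypted \
      --query SnapshotId --output text)

  # Block until the copy completes (the waiter polls DescribeSnapshots).
  aws ec2 wait snapshot-completed --region "$dst_region" --snapshot-ids "$new_snapshot_id"

  # Create a 20 TiB io2 volume from the completed copy (io1 caps at 16 TiB).
  aws ec2 create-volume \
      --region "$dst_region" \
      --snapshot-id "$new_snapshot_id" \
      --volume-type io2 --iops 3000 --size 20480 \
      --kms-key-id "$kms_key_id" \
      --availability-zone "$az" \
      --query VolumeId --output text
}
```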


⚠️ Ensure you have an IAM role with a cross-account policy; the source account must also share the snapshot with your account and grant your account access to the KMS key that encrypts it.
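On the source-account side, sharing is done with `modify-snapshot-attribute`. A sketch, with the snapshot and account IDs as placeholders:

```shell
# Run in the SOURCE account: share the snapshot with the destination account.
# Both arguments are placeholders.
share_snapshot() {
  local snapshot_id="$1" destination_account_id="$2"
  aws ec2 modify-snapshot-attribute \
      --snapshot-id "$snapshot_id" \
      --attribute createVolumePermission \
      --operation-type add \
      --user-ids "$destination_account_id"
}
```

For an encrypted snapshot this alone is not enough: the destination account also needs permission on the source KMS key (via the key policy or a grant) before the copy will succeed.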

💡 Additionally, you can consider using AWS Lambda to orchestrate the entire process, which would provide more flexibility and maintainability in the long run.
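For example, the whole workflow could be kicked off with a single Lambda invocation; note that the function name and payload schema below are hypothetical, not an existing AWS API:

```shell
# Hypothetical: invoke an orchestrator Lambda with the snapshot details as the
# event payload. Function name and payload fields are assumptions.
start_replication() {
  aws lambda invoke \
      --function-name snapshot-replication-orchestrator \
      --cli-binary-format raw-in-base64-out \
      --payload '{"sourceSnapshotId":"<source-snapshot-id>","sourceRegion":"<source-region>"}' \
      response.json
}
```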

EXPERT
Answered a month ago
EXPERT
Reviewed a month ago
AWS EXPERT
Reviewed a month ago
  • Thanks for the response. I already implemented a Step Functions workflow with Lambda, but Lambda has a 900-second execution time limit. I have now divided the task across multiple Lambdas and can handle at most 80 GB. Could you please suggest how my solution can support up to 20 TB?

  • You're right about the Lambda execution time limitation. In this case, I'd suggest using AWS Data Pipeline. With Data Pipeline, you can define workflow activities similar to what I described earlier: copying snapshots, creating volumes, attaching volumes, and formatting and mounting them.

    ℹ️ Data Pipeline is not limited by the Lambda execution time and allows you to efficiently move up to 20TB of data. Additionally, with Data Pipeline, you only pay for the services you use, making it a cost-effective solution for your needs.
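Whichever service runs the workflow, be aware that the CLI's built-in `snapshot-completed` waiter gives up after about 10 minutes (40 polls at 15-second intervals), while copying a multi-terabyte snapshot can take hours. A long-running activity can poll `describe-snapshots` itself; a sketch:

```shell
# Poll until the snapshot copy reaches a terminal state; the default CLI waiter
# times out long before a 20 TB copy finishes.
wait_for_snapshot() {
  local region="$1" snapshot_id="$2" state
  while true; do
    state=$(aws ec2 describe-snapshots --region "$region" \
        --snapshot-ids "$snapshot_id" \
        --query 'Snapshots[0].State' --output text)
    case "$state" in
      completed) return 0 ;;
      error)     return 1 ;;
      *)         sleep 60 ;;
    esac
  done
}
```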
