Migrate data that uses the Windows deduplication feature to FSx for Windows


Hello

Currently, 100 TB of data is compressed down to 50 TB using the Windows Data Deduplication feature (on EC2 in AWS).
I want to migrate this data to FSx for Windows File Server.
(FSx pricing is driven by storage capacity and throughput capacity, so the stored size directly affects cost.)

EC2 ⇒ robocopy ⇒ FSx for Windows

However, if deduplication is lost during the migration, the data will expand back to 100 TB and incur a lot of extra cost.

Is there a specific way to migrate the data to FSx while keeping it deduplicated?
We want to keep costs down when migrating the data to FSx.

I want to use Windows robocopy as the data migration method.

fuka
Asked 4 years ago · 526 views
2 Answers

Hi fuka,

In this case, you can create an Amazon FSx file system closer to the size you expect to end up with - 50 TB. As soon as you create the Amazon FSx file system, enable Data Deduplication and set the deduplication optimization schedule to run aggressively (see our documentation for details on how to set the schedule: https://docs.aws.amazon.com/fsx/latest/WindowsGuide/using-data-dedup.html). When you then copy your existing file content to the Amazon FSx file system, the data is deduplicated continuously as it arrives, so your overall data set can fit in the smaller file system.
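For example, a minimal sketch of enabling deduplication and adding an aggressive optimization schedule through the file system's Windows Remote PowerShell endpoint. The endpoint DNS name, schedule name, start time, and duration below are placeholders to adapt to your environment, not prescribed values:

```powershell
# Placeholder: your file system's Windows Remote PowerShell endpoint DNS name
$FSxEndpoint = "amznfsxzzzzzzzz.corp.example.com"

# Enable data deduplication on the FSx for Windows file system
Invoke-Command -ComputerName $FSxEndpoint -ConfigurationName FSxRemoteAdmin -ScriptBlock {
    Enable-FSxDedup
}

# Add an aggressive optimization schedule so newly copied data is deduplicated
# soon after it lands (name, start time, and duration are example values)
Invoke-Command -ComputerName $FSxEndpoint -ConfigurationName FSxRemoteAdmin -ScriptBlock {
    New-FSxDedupSchedule -Name "MigrationOptimization" -Type Optimization -Start "00:30" -DurationHours 23
}

# Monitor space savings while the migration runs
Invoke-Command -ComputerName $FSxEndpoint -ConfigurationName FSxRemoteAdmin -ScriptBlock {
    Get-FSxDedupStatus
} | Select-Object OptimizedFilesCount, SavedSpace
```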

Thank you,
Amazon FSx team

AWS
Answered 4 years ago
  • Bulk data transfers with deduplication enabled are not recommended.

  • Per the FSx user guide:

    "Warning: It is not recommended to run certain Robocopy commands with data deduplication because these commands can impact the data integrity of the Chunk Store. For more information, see the Microsoft Data Deduplication interoperability documentation."

    (A file-level robocopy sketch that stays away from those commands is shown below.)


Thank you!

fuka
Answered 4 years ago
