Move FSx backup to S3 then to Glacier


Hi, I have created a Windows FSx file system with a daily backup schedule. Now I want to move these backup files to S3 and then to Glacier. My plan is to keep daily backups in one folder; after seven days, that week's daily backups should move to a weekly backup folder, and after four weeks the weekly backups should move to a monthly backup folder, all within S3, and then transition to Glacier. Can anyone help me with some scripts and steps? I just completed the certification and am still learning.

I really appreciate any help you can provide.

2 Answers
Accepted Answer

Greetings,

Sure, the first thing to understand is that AWS does not support a direct backup of FSx to S3 or Glacier. However, you can build your own solution to do so.

This solution will involve copying data from FSx to S3, managing the S3 lifecycle to move old data to Glacier, and handling data rotation within the S3 bucket itself. The rotation (daily to weekly, weekly to monthly) can be done with an AWS Lambda function. Please note that lifecycle transitions to Glacier are not instantaneous; they can take around 24 hours to apply.

Here are some general steps and scripts to help you start with this setup:

  1. Create an Amazon S3 bucket and configure its lifecycle rules. You can do this manually or via CLI.

Manually: Go to the S3 console, create a bucket, navigate to Management tab > Lifecycle > Add lifecycle rule.
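Via AWS CLI (a minimal sketch; the bucket name my-fsx-backups, the monthly/ prefix, and the 30-day transition are placeholders to adjust to your setup):

aws s3api create-bucket --bucket my-fsx-backups --region us-east-1

aws s3api put-bucket-lifecycle-configuration --bucket my-fsx-backups \
    --lifecycle-configuration '{
      "Rules": [
        {
          "ID": "monthly-to-glacier",
          "Status": "Enabled",
          "Filter": { "Prefix": "monthly/" },
          "Transitions": [ { "Days": 30, "StorageClass": "GLACIER" } ]
        }
      ]
    }'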

  2. Set up an AWS DataSync task to sync data from your FSx to your S3 bucket. DataSync can automate moving data between on-premises storage, FSx, and S3.

Manually: Go to the AWS DataSync console, create a task, and configure it as required.

Via AWS CLI:

aws datasync create-task --source-location-arn <fsx-location> --destination-location-arn <s3-location>
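The two location ARNs come from creating the locations first. A sketch, assuming placeholder ARNs and credentials, and a pre-existing IAM role that lets DataSync write to the bucket:

aws datasync create-location-fsx-windows \
    --fsx-filesystem-arn arn:aws:fsx:us-east-1:111122223333:file-system/fs-0123456789abcdef0 \
    --security-group-arns arn:aws:ec2:us-east-1:111122223333:security-group/sg-0123456789abcdef0 \
    --user Admin --domain example.com --password "<your-password>"

aws datasync create-location-s3 \
    --s3-bucket-arn arn:aws:s3:::my-fsx-backups \
    --s3-config BucketAccessRoleArn=arn:aws:iam::111122223333:role/datasync-s3-access

You can then run the task with aws datasync start-task-execution --task-arn <task-arn>.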

  3. To manage the data rotation within S3 (daily to weekly, weekly to monthly), create an AWS Lambda function that is triggered on a schedule by a CloudWatch Events rule.

In this Lambda function, use the boto3 AWS SDK for Python to handle moving files around in the S3 bucket.

Here's a skeleton of what this Lambda function could look like (a minimal sketch; the bucket name and the daily/, weekly/, monthly/ prefixes are assumptions based on your folder plan):
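import boto3
from datetime import datetime, timezone, timedelta

s3 = boto3.client("s3")
BUCKET = "my-fsx-backups"  # placeholder: replace with your bucket name

def promote(src_prefix, dst_prefix, older_than_days):
    """Copy objects older than the cutoff to the next prefix, then delete the originals."""
    cutoff = datetime.now(timezone.utc) - timedelta(days=older_than_days)
    paginator = s3.get_paginator("list_objects_v2")
    for page in paginator.paginate(Bucket=BUCKET, Prefix=src_prefix):
        for obj in page.get("Contents", []):
            if obj["LastModified"] < cutoff:
                key = obj["Key"]
                new_key = dst_prefix + key[len(src_prefix):]
                # S3 has no native "move"; copy the object, then delete the original
                s3.copy_object(Bucket=BUCKET,
                               CopySource={"Bucket": BUCKET, "Key": key},
                               Key=new_key)
                s3.delete_object(Bucket=BUCKET, Key=key)

def lambda_handler(event, context):
    promote("daily/", "weekly/", 7)     # daily backups older than 7 days -> weekly/
    promote("weekly/", "monthly/", 28)  # weekly backups older than 4 weeks -> monthly/
    return {"status": "rotation complete"}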

References:

  1. Configuring lifecycle rules https://docs.aws.amazon.com/AmazonS3/latest/userguide/object-lifecycle-mgmt.html

  2. AWS CLI command reference for S3 https://awscli.amazonaws.com/v2/documentation/api/latest/reference/s3/index.html

  3. Amazon FSx https://docs.aws.amazon.com/fsx/index.html

  4. AWS CLI command reference for DataSync https://awscli.amazonaws.com/v2/documentation/api/latest/reference/datasync/index.html

  5. AWS Lambda documentation https://docs.aws.amazon.com/lambda/index.html

Please let me know if you have any further questions.

ZJon (AWS EXPERT), answered 10 months ago
  • Hi ZJon, thanks for the response! I believe the Lambda script is incomplete. Also, does it automatically start this backup automation when I run this code in Lambda? Thanks in advance.

  • Greetings, I apologize for any confusion caused. The provided script was indeed a skeleton meant to give you an idea of how you could handle data rotation within your S3 bucket. However, it's important to note that the Lambda function itself doesn't automatically start the backup process; the function must be invoked by an event or manually. In your case, you might want to trigger this Lambda function daily using an Amazon CloudWatch Events rule (a CLI sketch follows this comment thread). Please let me know if that helps.

  • Hi ZJon, I also need to create a restore point from S3 to FSx. Can we connect directly? I need to discuss something in more detail if you're available.
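To schedule the rotation Lambda discussed above, a CloudWatch Events (EventBridge) rule can invoke it once a day. A minimal CLI sketch, assuming a function named fsx-backup-rotation and placeholder account/region values:

aws events put-rule --name daily-backup-rotation --schedule-expression "rate(1 day)"

aws events put-targets --rule daily-backup-rotation \
    --targets "Id"="1","Arn"="arn:aws:lambda:us-east-1:111122223333:function:fsx-backup-rotation"

aws lambda add-permission --function-name fsx-backup-rotation \
    --statement-id allow-cloudwatch-events --action lambda:InvokeFunction \
    --principal events.amazonaws.com \
    --source-arn arn:aws:events:us-east-1:111122223333:rule/daily-backup-rotation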


It's not clear why you would move files between folders. Instead, have a script copy yesterday's files to S3 (a sketch follows below) and, in S3, create lifecycle policies.

Moving files between folders means each folder keeps receiving new files, so you would still have to select files by their timestamps rather than by folder name, since every folder keeps filling up daily. Instead of maintaining that file/folder logic, it's better to let lifecycle rules serve the purpose: you don't need to hold files in a folder for a set period, and S3 transitions the objects for you based on your requirement.
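For example, if the FSx for Windows share is mounted on a Windows host, a scheduled nightly job could push new files with aws s3 sync (a sketch; the drive path and bucket name are placeholders):

aws s3 sync Z:\Backups s3://my-fsx-backups/daily/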

S3 console -> Open S3 bucket -> Management -> Lifecycle rules -> Create lifecycle rule ->

  1. Enter a Lifecycle rule name
  2. Choose Apply to all objects in the bucket (if this bucket was created only for this specific purpose); otherwise choose Limit the scope of this rule using one or more filters
  3. Lifecycle rule actions -> Move current versions of objects between storage classes -> Choose the appropriate storage class (per your requirement) -> Enter the number of days for Days after object creation (based on your use case); an equivalent CLI call is sketched after this list
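The same rule can be set from the CLI (a sketch, assuming the bucket is dedicated to these backups so the rule applies to all objects, and using 30 days as an example):

aws s3api put-bucket-lifecycle-configuration --bucket my-fsx-backups \
    --lifecycle-configuration '{
      "Rules": [
        {
          "ID": "backups-to-glacier",
          "Status": "Enabled",
          "Filter": {},
          "Transitions": [ { "Days": 30, "StorageClass": "GLACIER" } ]
        }
      ]
    }'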

Some useful resources for S3 storage tiering and pricing:

https://docs.aws.amazon.com/console/s3/pricingconsiderations
https://aws.amazon.com/s3/pricing
https://docs.aws.amazon.com/AmazonS3/latest/userguide/intelligent-tiering-overview.html

How to transfer files to/from FSX: https://docs.aws.amazon.com/fsx/latest/WindowsGuide/use-data-sync.html

AWS EXPERT, answered 10 months ago
