
How to change the storage class on a bucket


Good day all,

I have created an S3 bucket with a lifecycle rule that moves incoming data to S3 Glacier Instant Retrieval, and I would now like to move that data to S3 Glacier Deep Archive instead.

My thought process was to either

  1. Disable the lifecycle rule and change the whole bucket to the Deep Archive class, but I'm not sure whether that would apply to what's already in Glacier Instant Retrieval, or
  2. Run a CLI script to move the data, but will that duplicate the data, and are there any fees?

What would be the best way to do this? I'm fairly new to AWS, so I mostly go with what I read online.

Thanks in advance

Asked 2 months ago · 117 views
4 Answers

To change your S3 objects from Glacier Instant Retrieval to Deep Archive, you have a few options:

For objects already in Glacier Instant Retrieval:

  1. You can use the AWS CLI to copy the objects in place while changing their storage class. This doesn't duplicate your data; it rewrites each object under the same key with the new storage class (a sketch follows after this list):
aws s3 cp s3://your-bucket/ s3://your-bucket/ --recursive --storage-class DEEP_ARCHIVE
  2. You can create a new lifecycle rule to transition objects from Glacier Instant Retrieval to Deep Archive. One important limitation: S3 Lifecycle only moves objects down the storage-class waterfall, so from Glacier Instant Retrieval the only supported transitions are to Glacier Flexible Retrieval and Glacier Deep Archive.
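A minimal sketch of that in-place copy, assuming the bucket is named your-bucket and that you want to preview the operation first (the --dryrun flag only prints what would be copied):

# Preview which objects would be rewritten with the new storage class
aws s3 cp s3://your-bucket/ s3://your-bucket/ --recursive --storage-class DEEP_ARCHIVE --dryrun

# Run the copy for real once the preview looks right
aws s3 cp s3://your-bucket/ s3://your-bucket/ --recursive --storage-class DEEP_ARCHIVE

# Spot-check one object afterwards; the key shown here is a placeholder
aws s3api head-object --bucket your-bucket --key path/to/object

For an object that has been moved, head-object reports "StorageClass": "DEEP_ARCHIVE" in its output.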

For future objects:

  1. Update your existing lifecycle configuration to transition objects directly to Deep Archive instead of Glacier Instant Retrieval (see the sketch below).
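If you manage the lifecycle configuration from the CLI, this is a sketch of what the updated rule could look like, assuming a file named lifecycle.json and a transition at 0 days (adjust the filter prefix and day count to your setup):

# lifecycle.json - transitions all new objects to Deep Archive
{
  "Rules": [
    {
      "ID": "to-deep-archive",
      "Status": "Enabled",
      "Filter": { "Prefix": "" },
      "Transitions": [
        { "Days": 0, "StorageClass": "DEEP_ARCHIVE" }
      ]
    }
  ]
}

# Apply it; note that this call replaces the bucket's whole lifecycle configuration,
# so include any existing rules you want to keep in the same file
aws s3api put-bucket-lifecycle-configuration --bucket your-bucket --lifecycle-configuration file://lifecycle.json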

Important considerations:

  • Simply disabling the current lifecycle rule won't change the storage class of objects already in Glacier Instant Retrieval
  • The CLI copy command doesn't leave duplicate objects behind (unless the bucket has versioning enabled, in which case the old Glacier Instant Retrieval versions remain and keep accruing charges until they are expired), but each copy counts as a new PUT request, which has a per-request cost
  • When transitioning between storage classes, be aware of minimum storage duration requirements to avoid early-deletion charges
  • The storage class change isn't immediate and may take some time to complete; a quick way to check progress is sketched below
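If you want to verify how far along things are, one rough check (a sketch, assuming the AWS CLI and a placeholder bucket name) is to group a listing of the bucket by storage class:

# Count objects per storage class; your-bucket is a placeholder
aws s3api list-objects-v2 --bucket your-bucket --query 'Contents[].StorageClass' --output text | tr '\t' '\n' | sort | uniq -c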

If you're managing a large number of objects, the lifecycle approach is more automated, while the CLI approach gives you more immediate control over the transition.
Sources
Setting the storage class of an object - Amazon Simple Storage Service
S3 bucket lifecycle policy through CLI | AWS re:Post
Setting an S3 Lifecycle configuration on a bucket - Amazon Simple Storage Service
Transitioning objects using Amazon S3 Lifecycle - Amazon Simple Storage Service

answered 2 months ago

You need to do a couple of things -

  • Update your existing lifecycle rule to move objects from Glacier Instant Retrieval to Deep Archive. See screenshot.
  • Update your current process for uploading objects to S3 and make sure it specifies the storage class, as the re:Post Agent suggested (a sketch follows below).

Note: S3 buckets don't have a default storage-class setting that you can override. Uploads go to Standard unless the request specifies otherwise.

(Screenshot of the lifecycle rule configuration not included.)
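As a sketch of what specifying the storage class at upload time can look like, assuming the uploads happen through the AWS CLI (file name, bucket, and key are placeholders):

# Upload a new object straight into Deep Archive instead of Standard
aws s3 cp ./backup.tar.gz s3://your-bucket/backups/backup.tar.gz --storage-class DEEP_ARCHIVE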

EXPERT
answered 2 months ago

Thank you very much for the fast response :)

What I'll attempt is:

  1. Run the aws s3 cp s3://your-bucket/ s3://your-bucket/ --recursive --storage-class DEEP_ARCHIVE command
  2. Change my lifecycle rules to send new objects to Deep Archive
answered 2 months ago

My session timed out and there is nothing in CloudShell anymore?

answered 2 months ago
