I apologize, but there seems to be a misunderstanding in your question. There has not been any recent change to the S3 Glacier Deep Archive storage class that would limit file sizes to 5 GB. S3 Glacier Deep Archive continues to support objects of various sizes, including those larger than 5 GB.
The error message you're encountering is not related to a limitation of the S3 Glacier Deep Archive storage class itself. Instead, it's likely related to the method you're using to change the storage class of your objects.
When changing the storage class of existing objects in S3, there are different methods available, and some of these methods have limitations:
- Using the S3 console to change the storage class has a limitation of 5 GB for individual objects.
- Using the AWS CLI or SDKs, you can change the storage class of objects larger than 5 GB.
For objects larger than 5 GB, you should use the AWS CLI, SDKs, or REST API to change the storage class. These methods support changing the storage class for objects up to 5 TB in size.
To resolve your issue and fully back up your folder to Glacier Deep Archive:
- Use the AWS CLI or an SDK to change the storage class of your larger files.
- For future uploads, consider setting up a lifecycle rule to automatically transition objects to Glacier Deep Archive after a specified time period.
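As a sketch of the first step (the bucket and object names below are placeholders for your own), you can change an object's storage class with the AWS CLI by copying the object onto itself with a new `--storage-class`; the CLI handles the multipart copy automatically for objects larger than 5 GB:

```shell
# Copy the object onto itself, rewriting its storage class.
# "your-bucket" and the key are placeholders - substitute your own.
aws s3 cp s3://your-bucket/backups/large-file.bin \
          s3://your-bucket/backups/large-file.bin \
          --storage-class DEEP_ARCHIVE

# Or change every object under a prefix in one pass:
aws s3 cp s3://your-bucket/backups/ s3://your-bucket/backups/ \
          --recursive --storage-class DEEP_ARCHIVE
```

Note that this is a copy operation, so standard S3 request and transfer behavior applies, and versioned buckets will create a new object version.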
Rest assured, S3 Glacier Deep Archive remains a viable and cost-effective option for long-term storage of large files. There's no need to justify new storage expenses to management due to size limitations, as the storage class itself hasn't changed in this regard.
If you continue to experience issues, I recommend contacting AWS Support for further assistance, as they can provide more specific guidance based on your account and configuration.
Sources
Understanding archival storage in S3 Glacier Flexible Retrieval and S3 Glacier Deep Archive - Amazon Simple Storage Service
Coming Soon – S3 Glacier Deep Archive for Long-Term Data Retention
Understanding S3 Glacier storage classes for long-term data storage - Amazon Simple Storage Service
Interesting. I have always changed storage class using the S3 console without issue before, even for files >5GB. I will try using the CLI. Thanks.
You can move an object if your object is less than 5 GB. If your object is greater than 5 GB, you must use the AWS CLI or AWS SDKs to move an object.
[+] https://docs.aws.amazon.com/AmazonS3/latest/userguide/copy-object.html#MovingObjectsExamples
You can change an object's storage class using the Amazon S3 console if the object size is less than 5 GB. If larger, we recommend adding an S3 Lifecycle configuration to change the object's storage class.
[+] https://docs.aws.amazon.com/AmazonS3/latest/userguide/sc-howtoset.html#changing-storage-class
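For the lifecycle approach the documentation recommends, a minimal sketch looks like the following (the bucket name, rule ID, and 30-day threshold are placeholder assumptions; adjust them to your needs):

```shell
# Write a lifecycle rule that transitions all objects (empty prefix)
# to the Deep Archive storage class 30 days after creation.
cat > lifecycle.json <<'EOF'
{
  "Rules": [
    {
      "ID": "archive-to-deep-archive",
      "Status": "Enabled",
      "Filter": { "Prefix": "" },
      "Transitions": [
        { "Days": 30, "StorageClass": "DEEP_ARCHIVE" }
      ]
    }
  ]
}
EOF

# Apply the configuration to the bucket ("your-bucket" is a placeholder).
aws s3api put-bucket-lifecycle-configuration \
    --bucket your-bucket \
    --lifecycle-configuration file://lifecycle.json
```

A lifecycle rule has no object-size restriction, so it also covers objects larger than 5 GB without any per-object copy commands.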

Why is it no longer possible to make such changes in the AWS console? It was always possible in the past.
You can move an object if your object is less than 5 GB. If your object is greater than 5 GB, you must use the AWS CLI or AWS SDKs to move an object.
[+] https://docs.aws.amazon.com/AmazonS3/latest/userguide/copy-object.html#MovingObjectsExamples
You can change an object's storage class using the Amazon S3 console if the object size is less than 5 GB. If larger, we recommend adding an S3 Lifecycle configuration to change the object's storage class.
[+] https://docs.aws.amazon.com/AmazonS3/latest/userguide/sc-howtoset.html#changing-storage-class