AWS S3 Keeps Calculating Folder Size and Freezes

Hey AWS Community, this is Viktor. Recently we needed to calculate a specific folder's size under a bucket so we could estimate its storage cost. Our first approach was the AWS console: select the folder, go to Actions, and click Calculate total size, but it froze after reaching about 35 TB of data. Our second approach was the AWS CLI, recursively totaling the storage under the folder, but the session timed out after calculating for about an hour.

So, is there any solution we can calculate a big folder's size under a bucket?

asked 3 years ago · 4.2K views
3 Answers
Accepted Answer

I understand that you are not able to calculate the total size of the folder. I have run into this myself quite a few times when the folder size was over 100 TB.

I'd encourage you to use the S3 Inventory feature, which would work well in your situation; the only caveat is that you may have to wait 24-48 hours for the first inventory file to become available.

Also, please take a look at this Storage Blog, which lists plenty of options alongside S3 Inventory. The S3 Inventory setup process is quite self-explanatory: it gives you a bucket policy that you need to apply (Bucket -> Permissions -> Bucket policy) on the destination bucket where the inventory file will be saved. It usually doesn't take long.

If you don't want to go through S3 Inventory, then I'd suggest checking your CLI configuration. Are the timeout settings still at their defaults? If so, you can adjust them by following this AWS CLI Configuration Guide, and then run the command listed in that guide:

  aws s3 ls --summarize --human-readable --recursive s3://<bucket-name>/
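To see what that summary is adding up: each per-object line that `aws s3 ls --recursive` prints (without `--human-readable`) carries the size in bytes as its third field, so the sizes can also be totaled directly with awk. The sample lines below are made up for illustration; in real use the output would be piped in from the `aws s3 ls` command itself.

```shell
#!/bin/sh
# Sum the size column (field 3) of `aws s3 ls --recursive` output.
# The canned sample stands in for:
#   aws s3 ls --recursive s3://<bucket-name>/<prefix>/
sample='2023-05-01 10:00:00    1048576 folder/a.bin
2023-05-01 10:05:00    2097152 folder/b.bin'

bytes=$(printf '%s\n' "$sample" | awk '{sum += $3} END {print sum}')
echo "$bytes bytes"
```

Piping through awk this way avoids `--human-readable` rounding and gives an exact byte count you can convert yourself.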

Let me know how it goes.

AWS EXPERT · answered 3 years ago · reviewed 3 years ago
Am I correct in assuming that the following command is the one causing the timeout?

aws s3 ls s3://<bucket-name>/<prefix>/ --summarize --recursive --human-readable

I have never experienced this myself, but the command may stall when the number of objects or the total size is very large.
In that case, it can be effective to write a shell script that calls "aws s3api list-objects" directly and keeps totaling sizes for as long as the response includes a "NextMarker", i.e. while results are still truncated.
https://awscli.amazonaws.com/v2/documentation/api/latest/reference/s3api/list-objects.html
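The manual-pagination loop suggested above can be sketched as follows. Note that `fetch_page` here is a hypothetical stand-in, not a real AWS call: it returns canned pages of "size" lines plus a `NEXT <marker>` line, so the control flow can be run and checked offline. In real use it would wrap `aws s3api list-objects --bucket <bucket-name> --prefix <prefix> --marker "$1"` and extract the object sizes and `NextMarker` from the JSON response.

```shell
#!/bin/sh
# Sketch: page through list-objects results manually and sum object sizes.
# fetch_page is a stub standing in for an "aws s3api list-objects" call;
# numeric lines are object sizes, a NEXT line carries the NextMarker.
fetch_page() {
  case "$1" in
    "")    printf '100\n200\nNEXT page2\n' ;;  # first page, truncated
    page2) printf '300\n' ;;                   # final page, no NextMarker
  esac
}

total=0
marker=""
while :; do
  next=""
  while IFS=' ' read -r a b; do
    if [ "$a" = "NEXT" ]; then next="$b"; else total=$((total + a)); fi
  done <<EOF
$(fetch_page "$marker")
EOF
  [ -n "$next" ] || break   # no NextMarker: listing is complete
  marker="$next"
done
echo "total bytes: $total"
```

Because the script resumes from the marker after each page, it never holds more than one page of results in memory, so it keeps making progress on listings that would stall a single long-running `aws s3 ls` call.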

EXPERT · answered 3 years ago · reviewed 3 years ago
Have you taken a look at Storage Lens?

AWS EXPERT · answered 3 years ago
