I understand that you aren't able to calculate the total size of the folder. I've run into this quite a few times myself, with folders larger than 100 TB.
I'd encourage you to use the S3 Inventory feature, which would work well in your situation; the only caveat is that you may have to wait 24-48 hours for the inventory file to become available.
Also, please take a look at this Storage Blog, which lists plenty of options along with S3 Inventory. The inventory creation process is fairly self-explanatory, and it gives you the bucket policy that you need to apply (bucket -> bucket policy) on the bucket where the inventory file will be saved. Usually it doesn't take that long.
If you don't want to go through S3 Inventory, then I'd suggest checking your CLI configuration: if the timeout settings are still at their defaults, you can adjust them by following this AWS CLI Configuration Guide, and then run the command listed in that guide:
aws s3 ls --summarize --human-readable --recursive s3://<bucket-name>/
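For reference, the timeout settings mentioned above live in ~/.aws/config. A minimal sketch (the values are illustrative, not recommendations; cli_read_timeout of 0 disables the socket read timeout entirely):

```ini
# ~/.aws/config
[default]
# Socket read timeout in seconds; 0 means never time out on reads.
cli_read_timeout = 0
# Seconds allowed to establish the connection (CLI default is 60).
cli_connect_timeout = 60
```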
Let me know how it goes.
Am I correct in assuming that the following command is causing a timeout?
aws s3 ls s3://buckname/dirname/ --summarize --recursive --human-readable
I have never experienced this myself, but the command may stall if the number of objects is very large or the total size is huge.
If so, it may be effective to write a shell script that calls "aws s3api list-objects" to total up the sizes, paging through the results whenever "NextMarker" (or a full page of keys) indicates there is more to list.
https://awscli.amazonaws.com/v2/documentation/api/latest/reference/s3api/list-objects.html
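A minimal sketch of that script, assuming the AWS CLI is installed and your credentials allow s3:ListBucket. The function name, and the bucket/prefix arguments, are placeholders; it pages manually by passing the last key of each page as the next --marker and accumulates the Size of every object:

```shell
#!/usr/bin/env bash
# Page through "aws s3api list-objects" by hand and sum object sizes.
set -euo pipefail

sum_s3_prefix() {
  local bucket="$1" prefix="${2:-}"
  local marker="" total=0 count=0 last="" key size page

  while :; do
    # One page of up to 1000 keys; --query flattens each object to "Key<TAB>Size".
    page=$(aws s3api list-objects --bucket "$bucket" --prefix "$prefix" \
             --marker "$marker" --max-keys 1000 \
             --query 'Contents[].[Key, Size]' --output text)

    # The CLI prints "None" when a page has no contents.
    if [ -z "$page" ] || [ "$page" = "None" ]; then
      break
    fi

    count=0
    while IFS=$'\t' read -r key size; do
      total=$((total + size))
      last="$key"
      count=$((count + 1))
    done <<<"$page"

    # Fewer than 1000 keys means this was the final page.
    if [ "$count" -lt 1000 ]; then
      break
    fi
    marker="$last"   # continue listing after the last key we saw
  done

  echo "$total"
}

# Run only when a bucket is given on the command line, e.g.:
#   ./s3-total-size.sh my-bucket some/prefix/
if [ "$#" -ge 1 ]; then
  sum_s3_prefix "$@"
fi
```

Because each page is fetched and discarded, memory use stays flat no matter how many objects the prefix holds, which is the point of paginating rather than letting one `aws s3 ls --recursive` call run unbounded.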