How can I get the size of an Amazon S3 bucket?

1

s3cmd seems to access every file individually and isn't very scalable. Is there a scalable way to get the answer?

Edit: Ideally via the command line.

jedberg
Asked 3 years ago · 1983 views
4 Answers
5

In the AWS CLI you can list the bucket recursively and summarize the results in a human-readable format:

aws s3 ls s3://mybucket --recursive --human-readable --summarize
... [snipped output] ...
Total Objects: 46
   Total Size: 29.5 MiB
Expert
bwhaley
Answered 3 years ago
Reviewed by an AWS Expert 2 years ago
  • Will that work if your bucket has, say, 50 million objects in it?

  • It should work, though it will take a long time. I just listed a bucket with 6,363,094 objects, 24.9 GiB in size. Took 54 minutes.

4

Using the BucketSizeBytes CloudWatch metric (ref: https://docs.aws.amazon.com/AmazonS3/latest/userguide/metrics-dimensions.html) is probably your best bet. And since it's CloudWatch, you can also alarm on size, set up triggers, etc., as you would with any other metric.

Be careful, though: this metric might be a few hours behind.
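
As a minimal sketch of pulling that metric with the CLI (the bucket name and dates are placeholders; S3 publishes these storage metrics roughly once a day, so --period 86400 with the Average statistic returns the daily datapoint):

# daily BucketSizeBytes datapoint for the S3 Standard storage class
aws cloudwatch get-metric-statistics \
    --namespace AWS/S3 \
    --metric-name BucketSizeBytes \
    --dimensions Name=BucketName,Value=mybucket Name=StorageType,Value=StandardStorage \
    --start-time 2023-01-01T00:00:00Z --end-time 2023-01-02T00:00:00Z \
    --period 86400 \
    --statistics Average

Note that each storage class is reported under its own StorageType dimension (StandardStorage covers only S3 Standard objects), so you may need to query several values and add them up.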

AWS
Answered 3 years ago
Reviewed by an AWS Expert 2 years ago
1

In the S3 console you can navigate to your bucket and select all objects in it. Once all objects are selected, go to "Actions" > "Calculate total size". Hope that helps.

AWS
Mike_C
Answered 3 years ago
  • Is there a command line version of that?

  • Not that I am aware of. You could try this command, though. Output is in bytes, FYI (a conversion sketch follows below):

    aws s3api list-objects --bucket BUCKETNAME --output json --query "[sum(Contents[].Size), length(Contents[])]"
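
    A rough follow-up sketch for converting that byte total to GiB on the shell (BUCKETNAME remains a placeholder, and awk is assumed to be available):

    # sum every object's size, then divide by 1024^3 to print GiB
    aws s3api list-objects --bucket BUCKETNAME --output json --query "sum(Contents[].Size)" \
        | awk '{printf "%.1f GiB\n", $1 / 1024 / 1024 / 1024}'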

0

There are a few alternative ways:

  • CloudWatch metrics.
  • The Cost Explorer API, UI, or Billing.
  • Generate an S3 Inventory and use Athena to run a query that sums the object sizes (sketched below). It might be overkill, but you can filter as you wish if you need more than just the usage data.
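
A rough sketch of the Athena route, assuming an S3 Inventory table has already been created for the bucket as described in the S3 Inventory documentation (the database, table, and results-bucket names here are placeholders):

# run the aggregation; fetch the result afterwards with "aws athena get-query-results"
aws athena start-query-execution \
    --query-string "SELECT SUM(size) AS total_bytes FROM my_bucket_inventory" \
    --query-execution-context Database=s3_inventory_db \
    --result-configuration OutputLocation=s3://my-athena-query-results/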
Answered 2 years ago
