S3 Data transfer costs - specification (per bucket)


Hi,

I was wondering if there's a way to get a per-bucket breakdown of my S3 data transfer costs anywhere in the admin console or via the AWS CLI. I've only recently enabled these metrics on my bucket, but I'd like to access this information for the last 6 months as well.

If anyone could point me in the right direction I would be most thankful.

thanks,

Asked 9 months ago · 388 views

2 Answers

To view the cost per bucket, you need to use the tagging feature; based on those cost allocation tags, you can then filter your buckets in Cost Explorer:

  1. Go to Cost Explorer -> choose a date range in the right pane
  2. Granularity -> Monthly
  3. Dimension -> Usage Type
  4. Service -> S3
  5. Tag -> select the tag that identifies the bucket -> choose the tag value
  6. Apply
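The same tag-based filter can also be queried programmatically through the Cost Explorer API. A minimal sketch of the request parameters (the tag key `BucketName`, the bucket name, and the date range are illustrative assumptions):

```python
# Sketch of a Cost Explorer API request mirroring the console steps above.
# The tag key "BucketName" and the dates are illustrative assumptions; with
# credentials configured, this dict would be passed to
# boto3.client("ce").get_cost_and_usage(**params).
params = {
    "TimePeriod": {"Start": "2022-01-01", "End": "2022-07-01"},
    "Granularity": "MONTHLY",
    "Metrics": ["UnblendedCost"],
    "GroupBy": [{"Type": "DIMENSION", "Key": "USAGE_TYPE"}],
    "Filter": {
        "And": [
            {"Dimensions": {"Key": "SERVICE",
                            "Values": ["Amazon Simple Storage Service"]}},
            {"Tags": {"Key": "BucketName", "Values": ["my-example-bucket"]}},
        ]
    },
}
```

The `And` expression combines the service filter with the tag filter, just as steps 4 and 5 do in the console.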

This shows you when each type of usage occurred and how much it cost.

If you want to set up an alert, consider creating an AWS Budget and configuring an alarm on it, which will notify you if you cross a defined usage threshold. Follow the Well-Architected Lab instructions for setting up a budget alert based on your usage.
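For reference, the same budget alert can be defined through the Budgets API. A minimal sketch of the request body (the budget name, the 10 USD limit, the 80% threshold, and the email address are illustrative assumptions):

```python
# Sketch of an AWS Budgets request for the alert described above. The name,
# limit, threshold, and email are illustrative assumptions; with credentials
# configured this would be passed to
# boto3.client("budgets").create_budget(AccountId="123456789012", **request).
request = {
    "Budget": {
        "BudgetName": "s3-data-transfer-budget",
        "BudgetLimit": {"Amount": "10", "Unit": "USD"},
        "TimeUnit": "MONTHLY",
        "BudgetType": "COST",
        "CostFilters": {"Service": ["Amazon Simple Storage Service"]},
    },
    "NotificationsWithSubscribers": [
        {
            "Notification": {
                "NotificationType": "ACTUAL",
                "ComparisonOperator": "GREATER_THAN",
                "Threshold": 80.0,
                "ThresholdType": "PERCENTAGE",
            },
            "Subscribers": [
                {"SubscriptionType": "EMAIL", "Address": "you@example.com"}
            ],
        }
    ],
}
```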

If you want to calculate or estimate S3 usage costs, refer to the S3 pricing page and the AWS Pricing Calculator.
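Data transfer out is typically priced in tiers, so an estimate has to walk each price band in turn. A rough sketch of that arithmetic (the tier sizes and per-GB rates below are illustrative assumptions, not current prices; always check the S3 pricing page):

```python
def estimate_transfer_out_cost(gb):
    """Estimate data-transfer-out cost for `gb` gigabytes in one month.

    The tiers are illustrative assumptions: 100 GB free, then a flat rate
    for the next band, then a slightly lower rate. Real S3 pricing differs
    by Region and changes over time.
    """
    tiers = [(100, 0.00), (10140, 0.09), (float("inf"), 0.085)]  # (GB, $/GB)
    cost, remaining = 0.0, gb
    for size, rate in tiers:
        band = min(remaining, size)   # GB billed in this tier
        cost += band * rate
        remaining -= band
        if remaining <= 0:
            break
    return round(cost, 2)
```

For example, under these assumed rates, 500 GB would be 100 GB free plus 400 GB at the second-tier rate.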

Hope you find this helpful.

Comment here if you have additional questions, happy to help.

Abhishek

AWS
EXPERT
Answered 9 months ago
  • This might work for small workloads, but it won't work well at scale. With this method you have to tag each individual bucket with a tag key such as BucketName, and the tag value will be different for every single bucket, which is not an efficient use of tagging. Imagine thousands of buckets across hundreds of accounts: the governance complexity of this method makes it hard to maintain. The better option is to use the Cost and Usage Report (see the response from Dave Connelly AWS about it) to track the cost and usage of individual resources/buckets.
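The comment above recommends the Cost and Usage Report for per-bucket tracking. A rough sketch of aggregating a CUR CSV export locally (the column names are the standard CUR fields; the file path in the docstring is a hypothetical example):

```python
import csv
from collections import defaultdict

def s3_cost_by_bucket(cur_csv_path):
    """Sum unblended cost per S3 bucket from a Cost and Usage Report CSV.

    Uses the standard CUR columns lineItem/ProductCode, lineItem/ResourceId,
    and lineItem/UnblendedCost. Pass the path to an unzipped CUR export,
    e.g. "cur-2022-06.csv" (hypothetical example).
    """
    totals = defaultdict(float)
    with open(cur_csv_path, newline="") as f:
        for row in csv.DictReader(f):
            # Keep only S3 line items that are attributed to a bucket.
            if (row.get("lineItem/ProductCode") == "AmazonS3"
                    and row.get("lineItem/ResourceId")):
                totals[row["lineItem/ResourceId"]] += float(
                    row["lineItem/UnblendedCost"] or 0)
    return dict(totals)
```

Because the CUR records a `lineItem/ResourceId` per line item, no per-bucket tagging is needed.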


To identify the buckets that are responsible for high data transfer, check your S3 usage report. The report helps you review the operation, Region, and time when the data transfer occurred.

  1. Sign in to the AWS Management Console and open the Amazon S3 console at https://console.aws.amazon.com/s3/.
  2. In the title bar, choose your user name or account ID, and then choose Billing Dashboard.
  3. In the navigation pane, choose Cost & usage reports.
  4. Under AWS Usage Report, choose Create a Usage Report.
  5. For Services, choose Amazon Simple Storage Service.
  6. For Download Usage Report, choose the following settings:

  • Usage Types – For a detailed explanation of Amazon S3 usage types, see Understanding your AWS billing and usage reports for Amazon S3.
  • Operation – For a detailed explanation of Amazon S3 operations, see Tracking Operations in Your Usage Reports.
  • Time Period – The time period that you want the report to cover.
  • Report Granularity – Whether you want the report to include subtotals by the hour, by the day, or by the month.
  • Choose the Download format and follow the prompts to open or save the report.
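Once downloaded, the usage report can be aggregated per bucket locally. A rough sketch, assuming the usage report's CSV columns (Resource, UsageType, UsageValue); the usage-type substring match is an illustrative assumption:

```python
import csv
from collections import defaultdict

def transfer_bytes_by_bucket(report_path):
    """Sum data-transfer-out usage (in bytes) per bucket from a downloaded
    S3 usage report CSV.

    Assumes the report's Resource column holds the bucket name and that
    data-transfer-out usage types contain "DataTransfer-Out-Bytes"
    (illustrative assumption; Region prefixes vary).
    """
    totals = defaultdict(int)
    with open(report_path, newline="") as f:
        for row in csv.DictReader(f):
            if "DataTransfer-Out-Bytes" in row.get("UsageType", ""):
                totals[row["Resource"]] += int(float(row["UsageValue"]))
    return dict(totals)
```

Sorting the result by value points straight at the buckets driving the transfer charges.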

Review the S3 server access logs that are associated with the buckets that are responsible for high data transfer charges. This helps you to view detailed information about the requests. You can query the server access logs using Amazon Athena to get information on a specific date and time, operations, and requesters. For example, run the following query to see the amount of data that was transferred through a certain IP address during a specific time period:

SELECT SUM(bytessent) AS uploadtotal,
       SUM(objectsize) AS downloadtotal,
       SUM(bytessent + objectsize) AS total
FROM s3_access_logs_db.mybucket_logs
WHERE remoteip = '1.2.3.4'
  AND parse_datetime(requestdatetime, 'dd/MMM/yyyy:HH:mm:ss Z')
      BETWEEN parse_datetime('2021-07-01', 'yyyy-MM-dd')
          AND parse_datetime('2021-08-01', 'yyyy-MM-dd');
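If Athena isn't set up, a similar per-IP total can be computed directly from the raw log lines. A rough sketch, assuming the standard S3 server access log layout (the regex captures only the leading fields through bytes_sent and object_size; "-" placeholders count as 0):

```python
import re
from collections import defaultdict

# Matches the leading fields of an S3 server access log record:
# owner, bucket, [time], remote_ip, requester, request_id, operation,
# key, "request-URI", status, error_code, bytes_sent, object_size.
LOG_RE = re.compile(
    r'^(\S+) (\S+) \[([^\]]+)\] (\S+) (\S+) (\S+) (\S+) (\S+) "([^"]*)" '
    r'(\S+) (\S+) (\S+) (\S+)'
)

def bytes_by_remote_ip(lines):
    """Sum bytes_sent + object_size per remote IP across log lines."""
    totals = defaultdict(int)
    for line in lines:
        m = LOG_RE.match(line)
        if not m:
            continue  # skip malformed records
        ip, bytes_sent, object_size = m.group(4), m.group(12), m.group(13)
        totals[ip] += sum(int(v) for v in (bytes_sent, object_size)
                          if v.isdigit())
    return dict(totals)
```

Filtering `lines` by the bracketed timestamp first would reproduce the date-range restriction in the Athena query above.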

AWS
Answered 9 months ago
