I was not able to locate a table of the metadata; however, this data dictionary may help as a reference:
It appears you can also use Amazon Athena to analyze the data from your AWS Cost and Usage Reports in Amazon S3 using standard SQL:
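As an illustration, a CUR table registered in Athena can be queried with standard SQL. The table name below is hypothetical, and the column names assume the default CUR column naming (e.g. `line_item_unblended_cost`); they would need to match your own report definition:

```python
# Hypothetical Athena query against a CUR table. The table name "cur_table"
# and the date filter are placeholders; the columns follow the default CUR
# naming convention and may differ in your report.
query = """
SELECT line_item_product_code,
       SUM(line_item_unblended_cost) AS cost
FROM cur_table
WHERE line_item_usage_start_date >= DATE '2023-01-01'
GROUP BY line_item_product_code
ORDER BY cost DESC
"""
print(query)
```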
Hope this helps, but if you are still seeking a solution please update this thread with more info about your use case and I will investigate further.
Hi, thanks for that. Yes, I did find that info while trying to figure out how I'm going to use the file.
Another problem is that the CSV file has two different sizes (number of fields) depending on where I download it from. I wrote the system to try to get accurate costs (we can't afford to burn our fingers when we go live), but I used data from my own AWS account, and that has one fewer field than the company one.
I need stability somewhere, especially on the billing side. I also need to know the format of the fields. At the moment I'm guessing and might truncate important billing information.
No, the staff can't use Athena to check costs.
Thanks for the feedback about the CSV, Andre. I have let our Insights team know that you are looking for more stability on the billing side.
If you have anything else to add, please let me know. You can also open a support case:
Our Insights team reviewed your feedback. Is the concern that you are using a test account to build a CUR-based cost-monitoring tool from the CSV version? If the CSV option is selected, CUR may break the report up into multiple files depending on the file size.
If the concern is with different columns, that is also expected as the data provided is contextual to the individual account. There should be a CUR manifest file that provides details on each column that is available.
If you're looking to ultimately use this audit mechanism on your company's account, they recommended building it based on that CUR, to avoid issues in the future.
I hope this info helps, but if you have any additional feedback or questions, let me know.
Thanks for the feedback. CUR?
I found the JSON file that gets generated with the CSV file. It has a general description. There are no lengths, so I have to guess them...
At the moment I read the JSON file to get the format of the CSV file and then I read the CSV file.
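A minimal sketch of that approach, assuming a manifest whose `columns` list pairs a `category` and `name` with a `type` (the fragment below is invented for illustration; the real CUR manifest has more fields):

```python
import csv
import io
import json

# Hypothetical manifest fragment; the real CUR manifest carries more metadata.
manifest_json = """
{"columns": [
    {"category": "lineItem", "name": "UsageAmount", "type": "BigDecimal"},
    {"category": "lineItem", "name": "ProductCode", "type": "String"}
]}
"""

manifest = json.loads(manifest_json)
# CUR CSV headers take the form "category/name", e.g. "lineItem/UsageAmount".
expected_header = [f'{c["category"]}/{c["name"]}' for c in manifest["columns"]]

# Inline sample standing in for the downloaded CSV report.
csv_data = "lineItem/UsageAmount,lineItem/ProductCode\n2.5,AmazonS3\n"
reader = csv.DictReader(io.StringIO(csv_data))
rows = list(reader)
# Cross-check the CSV header against the manifest before trusting the data.
assert reader.fieldnames == expected_header
```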
Here is my summary so far...
String = VARCHAR(55)
Interval = VARCHAR(55)
DateTime = DateTime
BigDecimal = Decimal(20,10)
OptionalBigDecimal = Decimal(20,10) (problem with NULL values)
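One way to handle the `OptionalBigDecimal` case in code is to treat an empty CSV field as NULL rather than letting the decimal parse fail (a sketch; the widths in the list above are guesses, not documented limits):

```python
from decimal import Decimal

def parse_optional_big_decimal(value):
    """Return a Decimal, or None when the CSV field is empty (NULL)."""
    return Decimal(value) if value.strip() else None

print(parse_optional_big_decimal("2.5000000000"))  # prints 2.5000000000
print(parse_optional_big_decimal(""))              # prints None
```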
It would be nice to have a properly defined length and type for each item. AWS should know this!!
Is the issue with the CSV file when we add tags?
Update from our Cost & Usage Reports (CUR) team:
We don't provide character lengths, and this should not be an issue when creating tables from JSON (or with SQL CREATE statements). The CUR file can change, and the lengths can vary from one day to the next, so parsing the data to determine lengths is risky, as they may change.
For example, if we introduce a new product or service that has longer text lengths, or a new charge type that was not used before, setting fixed lengths has the risk of truncating text.
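Given that advice, one option is to generate the table definition from the manifest using unbounded or wide types so that no length change can truncate data. The mapping and manifest shape below are assumptions for illustration, not an AWS specification:

```python
# Map CUR manifest types to SQL types with no fixed character lengths.
# This mapping is an assumption, not an AWS-documented specification.
TYPE_MAP = {
    "String": "TEXT",
    "Interval": "TEXT",
    "DateTime": "TIMESTAMP",
    "BigDecimal": "DECIMAL(38,10)",
    "OptionalBigDecimal": "DECIMAL(38,10)",  # column stays nullable
}

def create_table_sql(table, columns):
    """columns: list of (name, cur_type) pairs taken from the manifest."""
    cols = ",\n  ".join(f'"{name}" {TYPE_MAP[t]}' for name, t in columns)
    return f"CREATE TABLE {table} (\n  {cols}\n);"

sql = create_table_sql("cur_report", [
    ("lineItem/ProductCode", "String"),
    ("lineItem/UsageAmount", "BigDecimal"),
])
print(sql)
```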
I hope this addresses your concerns around this, but please let me know if there's anything else I can help with.
If you had any specific account questions that may relate and need someone to take a closer look, you can also open a support case with us in the Support Center:
Thank you. I understand. It would be good to have a dictionary of what possible fields could be expected, but we can work around that.
My biggest problem is that I can't tag objects PUT into S3 (another forum post) so I'm unable to get costs per item. Everything is hanging at the moment...
I'm sorry for the inconvenience. It appears that currently, you can only tag an S3 bucket:
If you haven't already, have you looked at the Tag Editor? It might be helpful to play around with:
As we primarily assist with General Feedback, if you have any more questions or concerns relating to your account please open a support case in the Support Center so we can discuss in detail:
A support case will allow us to log your feedback on our end with your full details, securely, for effective tracking of these requests!
You can also add a new comment in the thread you mentioned or start a new thread for assistance from the community of experts:
No, looks like you can tag objects too...
You can add tags to new objects when you upload them, or you can add them to existing objects.
Only problem is that it is not working...
AWS Cost and Usage format? (asked 2 years ago)