Hello,
Thanks for contacting AWS.
Please find the pricing calculation below for your scenario:
The minimum number of TCUs used for the query is 4 TCU. (https://docs.aws.amazon.com/timestream/latest/developerguide/tcu.html#maxquery-tcu)
4 TCU * (30 seconds / 3,600 seconds per hour) = 0.033 TCU-hours (TCUs are billed on an hourly basis with per-second granularity)
0.033 TCU-hours * $0.518 per TCU-hour = $0.017094 per hour
$0.017094/hour * 720 hours in a month = $12.30768 per month.
As the queries can be executed concurrently (at least 7 queries per 4 TCU), the cost for 7 queries every 1 minute will be the same as 1 query every 1 minute. (https://docs.aws.amazon.com/timestream/latest/developerguide/tcu.html#estimate-compute-units)
Now, considering another example:
If 1 query runs every minute for 8 seconds, the calculation is as follows:
4 TCU * 38 seconds per query (8 seconds of runtime plus the 30-second minimum) * 60 queries per hour = 9,120 TCU-seconds, or 2.533 TCU-hours.
2.533 TCU-hours * $0.518 per TCU-hour (price in IAD) ≈ $1.31 per hour
$1.31/hour * 720 hours in a month ≈ $945 per month.
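The arithmetic in both examples can be sketched in Python. The 4-TCU minimum, $0.518/TCU-hour rate, and 720-hour month are taken from the answer above; treat this as an illustration of the answer's own assumptions, not confirmed billing behavior:

```python
# Sketch of the arithmetic in the answer above (assumed inputs:
# 4 TCU minimum, $0.518 per TCU-hour, 720 hours per month).
TCU_MIN = 4
PRICE_PER_TCU_HOUR = 0.518
HOURS_PER_MONTH = 720

# Example 1: one short query per minute, billed as one 30-second
# minimum per hour (the answer's interpretation).
tcu_hours_per_hour = TCU_MIN * 30 / 3600           # ~0.0333 TCU-hours
cost_per_hour = tcu_hours_per_hour * PRICE_PER_TCU_HOUR
monthly_1 = cost_per_hour * HOURS_PER_MONTH
print(round(monthly_1, 2))  # 12.43 (the answer's $12.31 truncates 0.0333 to 0.033)

# Example 2: 60 queries per hour, each billed for 38 seconds
# (8 s of runtime plus the 30 s minimum, as stated in the answer).
tcu_seconds = TCU_MIN * 38 * 60                    # 9,120 TCU-seconds
tcu_hours = tcu_seconds / 3600                     # ~2.533 TCU-hours
monthly_2 = tcu_hours * PRICE_PER_TCU_HOUR * HOURS_PER_MONTH
print(round(monthly_2, 2))  # 944.83 (the answer's $936 rounds $1.31/hour down to $1.30)
```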
Please note that if you are looking for anything specific, please reach out to our AWS Billing Support team with the required data.

If I work this for an alert query that runs once a minute:
- Number of Real-Time queries: 1 per minute / (60 seconds in a minute) = 0.02 per second
- 0.02 queries / 7 = 0.002857142857142857
- Max(0.002857142857142857, 1) = 1.00
- Rounding(1.00) = 1
- 4 TCU x 1 = 4 TCU
- 4 TCU x 0.518 USD x 730 hours in a month = 1,512.56 USD for real-time queries
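The steps above can be sketched as a short Python calculation. The numbers (7 concurrent queries per 4-TCU set, $0.518/TCU-hour, 730 hours/month) are taken from the quoted worksheet; the intermediate here is 1/60 ≈ 0.0167 rather than the worksheet's rounded 0.02, which does not change the result:

```python
# Sketch of the pricing-calculator steps quoted above (assumed inputs:
# 1 query per minute, 7 concurrent queries per 4-TCU set,
# $0.518 per TCU-hour, 730 hours per month).
queries_per_second = 1 / 60            # ~0.0167 (shown as 0.02 above)
tcu_sets = queries_per_second / 7      # fraction of one 4-TCU set needed
tcu_sets = max(tcu_sets, 1)            # floor of one always-on 4-TCU set
tcu = 4 * round(tcu_sets)              # 4 TCU
monthly = tcu * 0.518 * 730
print(round(monthly, 2))               # 1512.56
```

The `max(..., 1)` step is what drives the cost: no matter how rarely you query, this model charges for one full 4-TCU set running all month.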
What I'm still struggling with is that $1512/month is a lot to have a single alert evaluated once/minute for a month isn't it?
What is still not clear is what it means to have a 30-second minimum where TCUs are billed hourly with 1-second granularity. That means I pay a minimum of 30 seconds per hour? From the second example, it seems more like the 30-second minimum is applied per minute. Although I would think “used time including the minimum” should be 30 seconds, not 38 seconds.
@aryehleib What I gather is that it behaves sort of like a container. When you make a request, it spins up FOUR of these containers (that's more than $2/hr btw), and the container stays alive for 30 seconds. If your query takes 200ms you pay for 30s anyway. OTOH if your query takes 200ms, you can make MANY queries within that 30s for the same price.
I think the problem really is that this scales up but it doesn't scale down. If you had an alarm query or dashboard that made one query a minute, even if that query only took 100 msec to run, you'd pay for 30 seconds * 4 TCU (minimum) = ~$750/month. If you have hundreds of thousands of devices this pricing structure scales up, but if you have a few devices and minute-by-minute alert queries then it doesn't seem to me that Timestream is a viable option.
@wz2b yes, I see you are correct. It runs for 30s. So in the second example, if 1 query every 1 minute runs for 8 seconds, the calculation should be as follows: 4 TCU * 30 seconds (the used time is within the minimum) * 60 queries (per hour) = 2 TCU-hours per hour, or about $1/hour. Or another way of looking at it is: if you only run one query at a time and each query takes 30s or less, you will pay at least 1.7¢ per query.

And indeed, it doesn't scale down. If you can use DynamoDB for your use-case, it seems better priced for small-scale use, since it doesn't have any minimum units and pricing is directly proportional to the number and size of records you write or read/scan. Also, you can get consumed units for reads and writes directly in the API response. Timestream query behavior is non-intuitive (in that it doesn't leverage the time index to avoid scanning) and there is no direct visibility into TCUs consumed, so optimizing queries is a bit of shooting in the dark.
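The corrected math in this comment (and the ~$750/month figure in the earlier one) can be sketched as follows, assuming every query is billed a 30-second minimum at 4 TCU and $0.518/TCU-hour:

```python
# Sketch of the corrected per-query-minimum math from the comments
# (assumptions: 30 s billed minimum per query at 4 TCU,
# $0.518 per TCU-hour, 730 hours per month).
TCU_MIN = 4
PRICE_PER_TCU_HOUR = 0.518

# 60 queries per hour, each charged the 30 s minimum:
tcu_hours_per_hour = TCU_MIN * 30 * 60 / 3600     # 2.0 TCU-hours
cost_per_hour = tcu_hours_per_hour * PRICE_PER_TCU_HOUR
print(round(cost_per_hour, 3))                    # 1.036 -> "about $1/hour"

# Equivalently, the floor price of a single short query:
per_query = TCU_MIN * 30 / 3600 * PRICE_PER_TCU_HOUR
print(round(per_query, 4))                        # 0.0173 -> the 1.7 cents per query

# Monthly, at 730 hours:
print(round(cost_per_hour * 730, 2))              # 756.28 -> the "~$750/month"
```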