
Timestream pricing per query


If I understand it now, queries cost $0.518 per TCU-hour, with a minimum charge of 30 seconds per query, so the minimum cost for a single, simple query that doesn't scan a lot of data is:

($0.518 / TCU-hour) × (30 s / 3,600 s per hour) = $0.00431666667

Compare that to $0.01 per GB scanned with a minimum of 10 MB per query. This means that the minimum query you could run would cost:

($0.01 / GB) × (10 MB / 1,000 MB per GB) = $0.0001

In other words, the simplest query you can make now costs about 43 times as much as it did before?
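As a quick Python sanity check of that comparison (rates as quoted above; note this assumes a single TCU, which may be part of what I'm missing):

    # Minimum cost of one trivial query under each pricing model.
    new_min = 0.518 * (30 / 3600)   # $0.518/TCU-hour for 30 s -> ~$0.0043167
    old_min = 0.01 * (10 / 1000)    # $0.01/GB with a 10 MB minimum -> $0.0001
    ratio = new_min / old_min       # ~43x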

Looking at this another way, suppose you had a dashboard reading this data, performing a very simple query (like latest values, or average value over the last hour) and scanning very little data. Suppose this refreshes once a minute (there are 43,800 minutes in a month):

Old pricing: 43,800 × $0.0001 = $4.38/month
New pricing: 43,800 × $0.00431666667 = $189.07/month
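Or as the same kind of sketch:

    # Dashboard issuing one trivial query per minute, all month long.
    old_monthly = 43800 * 0.0001           # $4.38 per month
    new_monthly = 43800 * 0.00431666667    # ~$189.07 per month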

Is this really what happened? How is this justifiable? It seems crazy. Unless I'm misunderstanding the 30-second minimum: it seems to me that it's a per-query minimum. Do I have that wrong?

If I do not have it wrong, and it really is a roughly 40x price increase, then this is going to drive me off of Timestream for most use cases. It's just not worth it at this point. And the minimal managed InfluxDB instance is $87.65/month; I'm not doing that either. I'd rather just spin up my own t3.medium and install InfluxDB on it myself. Maybe I'm just being a cheapskate, but these price points seem to rule Timestream out as an option, not just for big systems but also for small ones.

What am I missing?

asked a year ago · 333 views
1 Answer

Hello,

Thanks for contacting AWS.

Please find the pricing calculation below for your scenario:

The minimum number of TCUs used for a query is 4 (https://docs.aws.amazon.com/timestream/latest/developerguide/tcu.html#maxquery-tcu).

4 TCU × (30 seconds / 3,600 seconds per hour) = 0.033 TCU-hours (as TCUs are billed on an hourly basis with per-second granularity)

0.033 TCU-hours × $0.518/TCU-hour = $0.017094/hour

$0.017094/hour × 720 hours in a month = $12.30768/month
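In Python, that works out as follows (a sketch of the arithmetic above, under this answer's reading that the 30-second minimum is billed once per hour):

    # 4-TCU minimum running for 30 seconds, billed once per hour (us-east-1 rate).
    tcu_hours = 4 * (30 / 3600)    # ~0.0333 TCU-hours
    hourly = tcu_hours * 0.518     # ~$0.0173 per hour
    monthly = hourly * 720         # ~$12.43 per month
    # (The $12.30768 above comes from rounding to 0.033 TCU-hours before multiplying.)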

As queries can be executed concurrently (at least 7 queries per 4 TCUs), the cost for 7 queries every minute will be the same as for 1 query every minute (https://docs.aws.amazon.com/timestream/latest/developerguide/tcu.html#estimate-compute-units).

Now, considering another example:

If 1 query every minute runs for 8 seconds, then the calculation is as follows:

4 TCU × 38 seconds (used time including the minimum) × 60 queries per hour = 9,120 TCU-seconds, or 2.533 TCU-hours

2.533 TCU-hours × $0.518/TCU-hour (price in IAD) ≈ $1.3/hour

$1.3/hour × 720 hours in a month = $936/month
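And the second scenario, under the same reading:

    # One query per minute, each running 8 seconds and billed as
    # 8 + 30 = 38 seconds per query, as in the answer above.
    tcu_hours_per_hour = 4 * 38 * 60 / 3600   # ~2.533 TCU-hours each hour
    hourly = tcu_hours_per_hour * 0.518       # ~$1.31 per hour
    monthly = hourly * 720                    # ~$945/month ($936 above uses the rounded $1.3/hour)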

Please note that if you are looking for anything specific, please reach out to our AWS Billing Support team with the required data.

AWS
SUPPORT ENGINEER
answered a year ago
  • If I work this for an alert query that runs once a minute:

    Number of Real-Time queries: 1 per minute / (60 seconds in a minute) = 0.02 per second
    0.02 queries / 7 = 0.002857142857142857
    Max(0.002857142857142857, 1) = 1.00
    Rounding(1.00) = 1
    4 TCU × 1 = 4 TCU
    4 TCU × 0.518 USD × 730 hours in a month = 1,512.56 USD for real-time queries

    What I'm still struggling with is that $1,512/month is a lot to have a single alert evaluated once a minute for a month, isn't it?
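    As a quick Python sketch of those steps (the divide-by-7 concurrency factor and the 730-hour month are as given above):

      # Calculator-style arithmetic for one alert query per minute.
      queries_per_second = 1 / 60                # ~0.0167 (shown rounded to 0.02 above)
      units = max(queries_per_second / 7, 1)     # floors at 1
      tcus = round(units) * 4                    # 4-TCU minimum -> 4 TCUs, always on
      monthly = tcus * 0.518 * 730               # = $1,512.56 per month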

  • What is still not clear is what it means to have a 30-second minimum when TCUs are billed hourly with 1-second granularity. Does that mean I pay a minimum of 30 seconds per hour? From the second example, it seems more like the 30-second minimum is applied per minute. Although I would think "used time including the minimum" should be 30 seconds, not 38 seconds.

  • @aryehleib What I gather is that it behaves sort of like a container. When you make a request, it spins up FOUR of these containers (that's more than $2/hr, btw), and the containers stay alive for 30 seconds. If your query takes 200 ms, you pay for 30 s anyway; OTOH, you can make MANY queries within that 30 s for the same price.

  • I think the problem really is that this scales up but it doesn't scale down. If you had an alarm query or dashboard that made one query a minute, even if that query only took 100 ms to run, you'd pay for 30 seconds × 4 TCU (minimum) per query, or roughly $750/month. If you have hundreds of thousands of devices, this pricing structure scales up; but if you have a few devices and minute-by-minute alert queries, it doesn't seem to me that Timestream is a viable option.

  • @wz2b yes, I see you are correct. It runs for 30 s. So in the second example, if 1 query every minute runs for 8 seconds, then the calculation should be as follows: 4 TCU × 30 seconds (the used time is included in the minimum) × 60 queries per hour = 2 TCU-hours per hour, or about $1/hour.

    Or another way of looking at it is: if you only run one query at a time and each query takes 30 s or less, you will pay at least 1.7¢ per query. And indeed, it doesn't scale down. If you can use DynamoDB for your use case, it seems better priced for small-scale use, since it doesn't have any minimum units and pricing is directly proportional to the number and size of records you write or read/scan. Also, you can get consumed units for reads and writes directly in the API response. Timestream query behavior is non-intuitive (in that it doesn't leverage the time index to avoid scanning) and there is no direct visibility into TCUs consumed, so optimizing queries is a bit of shooting in the dark.
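    Those floors as a Python sketch (4-TCU and 30-second minimums, us-east-1 rate):

      per_query = 4 * (30 / 3600) * 0.518            # ~$0.0173 -> the ~1.7 cents above
      monthly = 4 * (30 * 60 / 3600) * 0.518 * 730   # 2 TCU-hours/hour -> ~$756/month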

