CloudWatch Metric filters with default value not making sense

I've created a metric filter just to get an idea of how many times a specific log pattern shows up, nothing fancy. Metric value is set to 1 and Default value is set to 0. Since it's not a high-resolution metric, CloudWatch aggregates it over one-minute periods. All good with that. What I do not understand is the difference between the Sum and Sample Count statistics. Why would Sum and Sample Count have different values? The way I see it:

  • If there was no record matching the filter pattern in the 1-minute interval, Sum would be 0 and Sample Count would be 0.
  • If there was at least one record matching the filter pattern in the 1-minute interval, Sum would be X and Sample Count would be X, where X is greater than 0.
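
For context, here is roughly how the filter is configured (a minimal boto3 sketch; the log group, filter, namespace, and metric names are placeholders, not my actual resources):

```python
import boto3

logs = boto3.client("logs")

# Create a metric filter that emits a value of 1 for every matching log event
# and a default of 0 for periods with no matches
# (Metric value = 1, Default value = 0).
logs.put_metric_filter(
    logGroupName="/my/app/log-group",      # placeholder log group
    filterName="error-count-filter",       # placeholder filter name
    filterPattern='"ERROR:"',              # match log events containing ERROR:
    metricTransformations=[
        {
            "metricName": "ErrorCount",    # placeholder metric name
            "metricNamespace": "MyApp",    # placeholder namespace
            "metricValue": "1",            # each matching event contributes 1
            "defaultValue": 0.0,           # emitted when nothing matches in a period
        }
    ],
)
```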

An example: let's say I created a metric filter with the pattern "ERROR:", with Metric value set to 1 and Default value set to 0. We have the following logs in three different log streams under the same log group during a specific minute:

Log stream 1:

  • ERROR: XXXXXXX
  • INFO: XXXXXX

Log stream 2:

  • INFO: XXXXXX
  • INFO: XXXXXX

Log stream 3:

  • ERROR: XXXXXXX
  • ERROR: XXXXXXX
  • ERROR: XXXXXXX

What would the values for Sum and Sample Count be, in your opinion? 4 for both, right!?
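
If it helps, this is how I'd read the two statistics back for that minute (a sketch; the namespace, metric name, and timestamps are placeholders matching the filter sketch above):

```python
import datetime
import boto3

cloudwatch = boto3.client("cloudwatch")

# Fetch Sum and SampleCount for the custom metric over the minute in question.
response = cloudwatch.get_metric_statistics(
    Namespace="MyApp",                                # placeholder namespace
    MetricName="ErrorCount",                          # placeholder metric name
    StartTime=datetime.datetime(2024, 1, 1, 12, 0),   # example minute start
    EndTime=datetime.datetime(2024, 1, 1, 12, 1),     # example minute end
    Period=60,                                        # one-minute aggregation
    Statistics=["Sum", "SampleCount"],
)

for point in response["Datapoints"]:
    print(point["Timestamp"], "Sum:", point["Sum"], "SampleCount:", point["SampleCount"])
```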
