CloudWatch Metric filters with default value not making sense


I've created a metric filter just to get an idea of how many times a specific log pattern shows up, nothing crazy. Metric value is set to 1 and Default value is set to 0. Since it's not a high-resolution metric, CloudWatch aggregates it over one-minute periods. All good with that. What I don't understand is the difference between the Sum and Sample Count statistics. Why would Sum and Sample Count ever have different values?

  • If no record in the 1-minute interval matches the filter pattern, Sum would be 0 and Sample Count would be 0.
  • If at least one record in the 1-minute interval matches the filter pattern, Sum would be X and Sample Count would be X, where X is greater than 0.
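
For reference, here is roughly how the filter is configured. This is a minimal boto3 sketch; the log group, filter, and metric names are hypothetical placeholders (the real ones shouldn't matter for the question):

```python
import boto3

logs = boto3.client("logs")

# Hypothetical log group, filter, and metric names, just for illustration.
logs.put_metric_filter(
    logGroupName="/my/app/logs",
    filterName="error-count",
    filterPattern='"ERROR:"',          # terms with special characters are quoted
    metricTransformations=[
        {
            "metricName": "ErrorCount",
            "metricNamespace": "MyApp",
            "metricValue": "1",        # publish 1 for every matching log event
            "defaultValue": 0.0,       # publish 0 when nothing matches in a period
        }
    ],
)
```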

An example: let's say I created a metric filter with the pattern "ERROR:", with Metric value set to 1 and Default value set to 0. We have the following logs in three different log streams under the same log group during a specific minute:

Log stream 1:

  • ERROR: XXXXXXX
  • INFO: XXXXXX

Log stream 2:

  • INFO: XXXXXX
  • INFO: XXXXXX

Log stream 3:

  • ERROR: XXXXXXX
  • ERROR: XXXXXXX
  • ERROR: XXXXXXX

What would the values for Sum and Sample Count be, in your opinion? 4 for both, right!?
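
For completeness, this is how I'm reading the statistics back, in case my query itself is the problem. Again a sketch: the namespace, metric name, and time window are placeholders matching the filter sketch above:

```python
import boto3
from datetime import datetime, timezone

cloudwatch = boto3.client("cloudwatch")

# Placeholder time window covering the one minute in question.
resp = cloudwatch.get_metric_statistics(
    Namespace="MyApp",
    MetricName="ErrorCount",
    StartTime=datetime(2024, 1, 1, 12, 0, tzinfo=timezone.utc),
    EndTime=datetime(2024, 1, 1, 12, 1, tzinfo=timezone.utc),
    Period=60,                          # one-minute aggregation period
    Statistics=["Sum", "SampleCount"],  # request both statistics side by side
)
for dp in resp["Datapoints"]:
    print(dp["Timestamp"], "Sum:", dp["Sum"], "SampleCount:", dp["SampleCount"])
```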
