Hello, it seems you need CloudWatch alarms for logs in CloudWatch Logs. You cannot create alarms directly on log entries, but there is another way.
First, create a metric filter for your log group: select the log group you want, and in the Actions drop-down choose Create Metric Filter. Define the filter pattern you need, and it will produce a metric for your log group.
Now you can easily create CloudWatch alarms on that metric and attach SNS notifications.
For more information, refer to this document: https://docs.aws.amazon.com/AmazonCloudWatch/latest/logs/CreateMetricFilterProcedure.html
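If you prefer to script the console steps above, the same setup can be sketched with boto3. The names (`FailoverEvents`, `Custom/Firewall`, the `"failover"` filter term) are illustrative assumptions, not taken from the question:

```python
def create_failover_alarm(log_group: str, sns_topic_arn: str) -> None:
    """Sketch: create a metric filter and an alarm for a log group."""
    import boto3  # AWS SDK for Python

    logs = boto3.client("logs")
    cloudwatch = boto3.client("cloudwatch")

    # 1. Metric filter: emit 1 whenever a log line contains the term "failover".
    logs.put_metric_filter(
        logGroupName=log_group,
        filterName="failover-filter",
        filterPattern='"failover"',
        metricTransformations=[{
            "metricName": "FailoverEvents",
            "metricNamespace": "Custom/Firewall",
            "metricValue": "1",
            "defaultValue": 0,
        }],
    )

    # 2. Alarm on the metric, notifying an SNS topic when it fires.
    cloudwatch.put_metric_alarm(
        AlarmName="firewall-failover",
        Namespace="Custom/Firewall",
        MetricName="FailoverEvents",
        Statistic="Sum",
        Period=60,
        EvaluationPeriods=1,
        Threshold=1,
        ComparisonOperator="GreaterThanOrEqualToThreshold",
        TreatMissingData="notBreaching",
        AlarmActions=[sns_topic_arn],
    )
```

`TreatMissingData="notBreaching"` is worth noting: without it, periods with no matching log lines (hence no data points) can push the alarm into INSUFFICIENT_DATA.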
With the built-in capabilities of CloudWatch Logs, I believe the best that can be accomplished is to create a metric filter: https://docs.aws.amazon.com/AmazonCloudWatch/latest/logs/MonitoringLogData.html
It will produce a CloudWatch metric, such as a simple 1 to indicate that the log entry represents a failover event and 0 to show that it doesn't. That numeric value can then be alerted on the same way as any regular numeric metric, like a CPU utilisation measurement.
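As a conceptual sketch (not AWS's actual filter engine), the 1/0 behaviour of such a metric filter amounts to:

```python
def metric_value(message: str, term: str = "failover") -> int:
    """Return 1 if the log message matches the filter term, else 0."""
    return 1 if term.lower() in message.lower() else 0

# metric_value("HA failover initiated on fw-01")  -> 1
# metric_value("heartbeat ok")                    -> 0
```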
What this won't allow is processing the individual log messages and acting on their contents, such as identifying which firewall performed the failover and sending an email notification naming it. The ideal service for that wouldn't be CloudWatch Logs but EventBridge, a serverless event bus capable of filtering, routing, and duplicating events between services; however, it won't help with logs that are delivered to CW Logs.
It's possible to construct selective log event processing based on CW Logs, but it would require combining multiple services. I think you'd have to start with subscription filters in CW Logs (https://docs.aws.amazon.com/AmazonCloudWatch/latest/logs/Subscriptions.html) to send the log entries to another service. If the amount of log data is small, simply having it processed by a Lambda function to trigger the needed actions might suffice. If log data arrives at a high rate, the stream would probably first have to be sent to an Amazon Data Firehose stream, which would be able to buffer multiple messages and combine them into a batch before triggering the processing logic.
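For the Lambda route, note that CloudWatch Logs delivers subscription filter data base64-encoded and gzip-compressed, so the handler first has to unpack it. A minimal sketch, where the `"failover"` keyword and any downstream action are assumptions:

```python
import base64
import gzip
import json

def handler(event, context=None):
    """Unpack a CloudWatch Logs subscription event and pick out failover lines."""
    raw = base64.b64decode(event["awslogs"]["data"])
    payload = json.loads(gzip.decompress(raw))

    failover_messages = [
        e["message"]
        for e in payload["logEvents"]
        if "failover" in e["message"].lower()
    ]
    # Here you could, e.g., publish each match to SNS (omitted in this sketch).
    return {"logGroup": payload["logGroup"], "matches": failover_messages}
```

The same decoding step applies if Firehose sits in between, since Firehose delivers the CW Logs records in the same gzipped format.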
