How do I analyze custom Amazon VPC flow logs using CloudWatch Logs Insights?

5 minute read

I set up custom VPC flow logs with Amazon Virtual Private Cloud (Amazon VPC) flow logs. I want to use Amazon CloudWatch Logs Insights to explore patterns and trends in the logs.

Short description

CloudWatch Logs Insights automatically discovers the fields in flow logs that use the default format, but it doesn't automatically discover the fields in flow logs that use a custom format.

To use CloudWatch Logs Insights with flow logs that use a custom format, you must modify the queries.

The following is an example of a custom flow log format:

${account-id} ${vpc-id} ${subnet-id} ${interface-id} ${instance-id} ${srcaddr} ${srcport} ${dstaddr} ${dstport} ${protocol} ${packets} ${bytes} ${action} ${log-status} ${start} ${end} ${flow-direction} ${traffic-path} ${tcp-flags} ${pkt-srcaddr} ${pkt-src-aws-service} ${pkt-dstaddr} ${pkt-dst-aws-service} ${region} ${az-id} ${sublocation-type} ${sublocation-id}

The following queries are examples of how to customize and extend queries for your use case.

Resolution

Retrieve the latest flow logs

To extract data from the log fields, use the parse keyword. For example, the output of the following query is sorted by the flow log event start time and limited to the two most recent log entries.

Query

#Retrieve latest custom VPC Flow Logs
parse @message "* * * * * * * * * * * * * * * * * * * * * * * * * * *" as account_id, vpc_id, subnet_id, interface_id, instance_id, srcaddr, srcport, dstaddr, dstport, protocol, packets, bytes, action, log_status, start, end, flow_direction, traffic_path, tcp_flags, pkt_srcaddr, pkt_src_aws_service, pkt_dstaddr, pkt_dst_aws_service, region, az_id, sublocation_type, sublocation_id
| sort start desc
| limit 2

Output

account_id | vpc_id | subnet_id | interface_id | instance_id | srcaddr | srcport
123456789012 | vpc-0b69ce8d04278ddd | subnet-002bdfe1767d0ddb0 | eni-0435cbb62960f230e | - | 172.31.0.104 | 55125
123456789012 | vpc-0b69ce8d04278ddd1 | subnet-002bdfe1767d0ddb0 | eni-0435cbb62960f230e | - | 91.240.118.81 | 49422
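As a local illustration of what the parse step does, the following Python sketch splits a space-delimited custom-format record into the same 27 named fields. The sample record and its values are illustrative, not output from a real account:

```python
# Minimal sketch of the query's parse step: split a space-delimited
# custom-format flow log record into the 27 named fields.
FIELDS = [
    "account_id", "vpc_id", "subnet_id", "interface_id", "instance_id",
    "srcaddr", "srcport", "dstaddr", "dstport", "protocol", "packets",
    "bytes", "action", "log_status", "start", "end", "flow_direction",
    "traffic_path", "tcp_flags", "pkt_srcaddr", "pkt_src_aws_service",
    "pkt_dstaddr", "pkt_dst_aws_service", "region", "az_id",
    "sublocation_type", "sublocation_id",
]

def parse_flow_log(message: str) -> dict:
    """Map each space-separated value to its field name, like the
    27-asterisk glob pattern in the parse statement."""
    values = message.split()
    if len(values) != len(FIELDS):
        raise ValueError(f"expected {len(FIELDS)} fields, got {len(values)}")
    return dict(zip(FIELDS, values))

# Illustrative record in the custom format shown earlier.
sample = (
    "123456789012 vpc-0b69ce8d04278ddd1 subnet-002bdfe1767d0ddb0 "
    "eni-0435cbb62960f230e - 172.31.0.104 55125 172.31.1.247 443 6 "
    "10 840 ACCEPT OK 1648826580 1648826640 ingress - 19 "
    "172.31.0.104 - 172.31.1.247 - us-east-1 use1-az2 - -"
)
record = parse_flow_log(sample)
```

Note that every parsed value is a string; the later aggregations on bytes rely on Logs Insights coercing numeric fields, which this sketch doesn't model.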

Aggregate data transfer by source and destination IP address pair

Use the following query to aggregate network traffic by source and destination IP address pair. In the example query, the sum statistic aggregates the bytes field. Because sum calculates a cumulative total of the data transferred between hosts, flow_direction is included in the query and its output. The aggregation result is assigned to the Data_Transferred field. The results are then sorted by Data_Transferred in descending order, and the two largest pairs are returned.

Query

parse @message "* * * * * * * * * * * * * * * * * * * * * * * * * * *" as account_id, vpc_id, subnet_id, interface_id, instance_id, srcaddr, srcport, dstaddr, dstport, protocol, packets, bytes, action, log_status, start, end, flow_direction, traffic_path, tcp_flags, pkt_srcaddr, pkt_src_aws_service, pkt_dstaddr, pkt_dst_aws_service, region, az_id, sublocation_type, sublocation_id
| stats sum(bytes) as Data_Transferred by srcaddr, dstaddr, flow_direction
| sort by Data_Transferred desc
| limit 2

Output

srcaddr | dstaddr | flow_direction | Data_Transferred
172.31.1.247 | 3.230.172.154 | egress | 346952038
172.31.0.46 | 3.230.172.154 | egress | 343799447
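Outside the console, the stats-sort-limit pipeline amounts to a plain group-by. The following Python sketch shows the same logic over a few illustrative parsed records (the bytes values are made up):

```python
from collections import defaultdict

# Illustrative parsed records; bytes values are invented for the sketch.
records = [
    {"srcaddr": "172.31.1.247", "dstaddr": "3.230.172.154",
     "flow_direction": "egress", "bytes": 300},
    {"srcaddr": "172.31.1.247", "dstaddr": "3.230.172.154",
     "flow_direction": "egress", "bytes": 200},
    {"srcaddr": "172.31.0.46", "dstaddr": "3.230.172.154",
     "flow_direction": "egress", "bytes": 400},
]

# stats sum(bytes) as Data_Transferred by srcaddr, dstaddr, flow_direction
data_transferred = defaultdict(int)
for rec in records:
    key = (rec["srcaddr"], rec["dstaddr"], rec["flow_direction"])
    data_transferred[key] += rec["bytes"]

# sort by Data_Transferred desc | limit 2
top_pairs = sorted(data_transferred.items(), key=lambda kv: kv[1],
                   reverse=True)[:2]
```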

Analyze data transfer by Amazon EC2 instance ID

You can use custom flow logs to analyze data transfer by Amazon Elastic Compute Cloud (Amazon EC2) instance ID. To identify the most active EC2 instances, include the instance_id field in the query.

Query

parse @message "* * * * * * * * * * * * * * * * * * * * * * * * * * *" as account_id, vpc_id, subnet_id, interface_id, instance_id, srcaddr, srcport, dstaddr, dstport, protocol, packets, bytes, action, log_status, start, end, flow_direction, traffic_path, tcp_flags, pkt_srcaddr, pkt_src_aws_service, pkt_dstaddr, pkt_dst_aws_service, region, az_id, sublocation_type, sublocation_id
| stats sum(bytes) as Data_Transferred by instance_id
| sort by Data_Transferred desc
| limit 5

Output

instance_id | Data_Transferred
- | 1443477306
i-03205758c9203c979 | 517558754
i-0ae33894105aa500c | 324629414
i-01506ab9e9e90749d | 198063232
i-0724007fef3cb06f3 | 54847643

Filter for rejected SSH traffic

To analyze traffic that your security groups and network access control lists (network ACLs) reject, filter on the REJECT action. To identify hosts that are rejected over SSH, extend the filter to include TCP traffic with destination port 22. The following example query uses TCP protocol number 6.

Query

parse @message "* * * * * * * * * * * * * * * * * * * * * * * * * * *" as account_id, vpc_id, subnet_id, interface_id, instance_id, srcaddr, srcport, dstaddr, dstport, protocol, packets, bytes, action, log_status, start, end, flow_direction, traffic_path, tcp_flags, pkt_srcaddr, pkt_src_aws_service, pkt_dstaddr, pkt_dst_aws_service, region, az_id, sublocation_type, sublocation_id
| filter action = "REJECT" and protocol = 6 and dstport = 22
| stats sum(bytes) as SSH_Traffic_Volume by srcaddr
| sort by SSH_Traffic_Volume desc
| limit 2

Output

srcaddr | SSH_Traffic_Volume
23.95.222.129 | 160
179.43.167.74 | 80
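The filter-then-aggregate logic of this query can be sketched in Python. The records below are illustrative; as in the query, only REJECTed TCP (protocol 6) traffic to destination port 22 is counted:

```python
from collections import Counter

# Illustrative records; the last two should be filtered out.
records = [
    {"srcaddr": "23.95.222.129", "action": "REJECT", "protocol": 6,
     "dstport": 22, "bytes": 160},
    {"srcaddr": "179.43.167.74", "action": "REJECT", "protocol": 6,
     "dstport": 22, "bytes": 80},
    {"srcaddr": "10.0.0.5", "action": "ACCEPT", "protocol": 6,
     "dstport": 22, "bytes": 999},   # accepted, so excluded
    {"srcaddr": "10.0.0.6", "action": "REJECT", "protocol": 17,
     "dstport": 53, "bytes": 50},    # UDP/DNS, so excluded
]

ssh_traffic_volume = Counter()
for rec in records:
    # filter action = "REJECT" and protocol = 6 and dstport = 22
    if rec["action"] == "REJECT" and rec["protocol"] == 6 and rec["dstport"] == 22:
        # stats sum(bytes) as SSH_Traffic_Volume by srcaddr
        ssh_traffic_volume[rec["srcaddr"]] += rec["bytes"]

# sort by SSH_Traffic_Volume desc | limit 2
top_sources = ssh_traffic_volume.most_common(2)
```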

Isolate the HTTP data stream for a specific source/destination pair

To analyze trends in your data, use CloudWatch Logs Insights to isolate the bidirectional traffic between two IP addresses. In the following query, ["172.31.1.247","172.31.11.212"] returns flow logs that have either IP address as the source or the destination. The filter statement matches VPC flow log events with TCP protocol 6 and port 80 to isolate HTTP traffic. To return a subset of all available fields, use the display keyword.

Query

Review the following query:

#HTTP Data Stream for Specific Source/Destination Pair
parse @message "* * * * * * * * * * * * * * * * * * * * * * * * * * *" as account_id, vpc_id, subnet_id, interface_id, instance_id, srcaddr, srcport, dstaddr, dstport, protocol, packets, bytes, action, log_status, start, end, flow_direction, traffic_path, tcp_flags, pkt_srcaddr, pkt_src_aws_service, pkt_dstaddr, pkt_dst_aws_service, region, az_id, sublocation_type, sublocation_id
| filter srcaddr in ["172.31.1.247","172.31.11.212"] and dstaddr in ["172.31.1.247","172.31.11.212"] and protocol = 6 and (dstport = 80 or srcport=80)
| display interface_id,srcaddr, srcport, dstaddr, dstport, protocol, bytes, action, log_status, start, end, flow_direction, tcp_flags
| sort by start desc
| limit 2

Output

interface_id | srcaddr | srcport | dstaddr | dstport | protocol | bytes | action | log_status
eni-0b74120275654905e | 172.31.11.212 | 80 | 172.31.1.247 | 29376 | 6 | 5160876 | ACCEPT | OK
eni-0b74120275654905e | 172.31.1.247 | 29376 | 172.31.11.212 | 80 | 6 | 97380 | ACCEPT | OK
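The bidirectional in-list filter plus the display projection can be sketched as a list comprehension. The records below are illustrative; the third one should be excluded because its peer is outside the host pair:

```python
# The two hosts whose mutual HTTP traffic we want to isolate.
HOSTS = {"172.31.1.247", "172.31.11.212"}

# Illustrative parsed records.
records = [
    {"srcaddr": "172.31.11.212", "srcport": 80, "dstaddr": "172.31.1.247",
     "dstport": 29376, "protocol": 6, "bytes": 5160876},
    {"srcaddr": "172.31.1.247", "srcport": 29376, "dstaddr": "172.31.11.212",
     "dstport": 80, "protocol": 6, "bytes": 97380},
    {"srcaddr": "172.31.1.247", "srcport": 51000, "dstaddr": "8.8.8.8",
     "dstport": 53, "protocol": 17, "bytes": 120},  # peer outside the pair
]

# filter srcaddr/dstaddr in the pair, protocol 6, and port 80 on either side
http_stream = [
    # like the display keyword: keep only a subset of the fields
    {k: rec[k] for k in ("srcaddr", "srcport", "dstaddr", "dstport", "bytes")}
    for rec in records
    if rec["srcaddr"] in HOSTS and rec["dstaddr"] in HOSTS
    and rec["protocol"] == 6
    and (rec["dstport"] == 80 or rec["srcport"] == 80)
]
```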

Visualize the results as a bar chart or pie chart

You can use CloudWatch Logs Insights to visualize results as a bar chart or pie chart. If the results include the bin() function, then the query output returns with a timestamp. You can then visualize the time series with a line graph or stacked area graph.

To calculate the cumulative data transferred in 1-minute intervals, use stats sum(bytes) as Data_Transferred by bin(1m). To view the visualization, toggle between the Logs and Visualization tables in the CloudWatch Logs Insights console.

Query

parse @message "* * * * * * * * * * * * * * * * * * * * * * * * * * *" as account_id, vpc_id, subnet_id, interface_id, instance_id, srcaddr, srcport, dstaddr, dstport, protocol, packets, bytes, action, log_status, start, end, flow_direction, traffic_path, tcp_flags, pkt_srcaddr, pkt_src_aws_service, pkt_dstaddr, pkt_dst_aws_service, region, az_id, sublocation_type, sublocation_id
| filter srcaddr in ["172.31.1.247","172.31.11.212"] and dstaddr in ["172.31.1.247","172.31.11.212"] and protocol = 6 and (dstport = 80 or srcport=80)
| stats sum(bytes) as Data_Transferred by bin(1m)

Output

bin(1m) | Data_Transferred
2022-04-01 15:23:00.000 | 17225787
2022-04-01 15:21:00.000 | 17724499
2022-04-01 15:20:00.000 | 1125500
2022-04-01 15:19:00.000 | 101525
2022-04-01 15:18:00.000 | 81376
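Conceptually, bin(1m) truncates each event's timestamp to the minute before grouping. A Python sketch of the same bucketing, using illustrative epoch-second start times:

```python
from collections import defaultdict
from datetime import datetime, timezone

# Illustrative records; start is the flow log event start time in epoch
# seconds (1648826580 is 2022-04-01 15:23:00 UTC).
records = [
    {"start": 1648826581, "bytes": 500},
    {"start": 1648826599, "bytes": 700},
    {"start": 1648826641, "bytes": 300},
]

# bin(1m): truncate each timestamp to the minute, then sum(bytes) per bucket
buckets = defaultdict(int)
for rec in records:
    minute = rec["start"] - rec["start"] % 60
    buckets[minute] += rec["bytes"]

# Render the buckets as (timestamp, Data_Transferred) pairs in UTC.
series = sorted(
    (datetime.fromtimestamp(t, tz=timezone.utc).strftime("%Y-%m-%d %H:%M:00"),
     total)
    for t, total in buckets.items()
)
```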

Related information

Supported logs and discovered fields

CloudWatch Logs Insights query syntax

AWS OFFICIAL
Updated 1 year ago