Is there a way to download the full log file from AWS Cloudwatch Log groups?


Hello,

I'm trying to browse a giant log file in CloudWatch and it's taking forever to keep clicking "Load older logs". Is there any way I can download all of the log events as a file?

Thanks

2 Answers

There are various open-source command-line tools, such as jorgebastida/awslogs and TylerBrock/saw, that you might find helpful.
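For example, awslogs (installable with pip install awslogs) can dump a whole group's events for a time range to a local file in one command. A minimal sketch; the log group name and date range here are made-up examples, not taken from the question:

```shell
# Sketch, not verified against a real account: build an awslogs call that
# fetches one week of a log group ('ALL' matches every stream in the group).
cmd="awslogs get 'nginx/access.log' ALL -s '2023-04-01' -e '2023-04-08' --no-color"
echo "$cmd"   # shown here; run it for real with:  eval "$cmd" > access.log
```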

AWS
John
answered 2 years ago
  • I was going to link to awslogs. Seems like a Johns thing (ha!).


FYI, we use awslogs. Requesting the entire log group in one go was SLOW.

So we created a process that splits the request into specific periods (year, month, day, hour, minute, second) and generates a list of awslogs calls, each one targeting a specific timeframe. Each generated call's output is piped through gzip and saved under a filename that encodes the log group and range. As the last step, the whole list is run through parallel, handling 10 requests at a time.

Running our wu command ("Workflow Utilities", a PHP / Symfony Console runner), wu logs:get-aws-logs nginx/access.log --split 3, we end up with a file of commands to run:

mkdir -p '/Users/richardq/data/logs/aws/nginx-access' && awslogs get 'nginx/access.log' -G -S --no-color -s '2023-04-01 00:00:00' -e '2023-04-01 23:59:59' | gzip > '/Users/richardq/data/logs/aws/nginx-access/2023-04-01.log.gz'
mkdir -p '/Users/richardq/data/logs/aws/nginx-access' && awslogs get 'nginx/access.log' -G -S --no-color -s '2023-04-02 00:00:00' -e '2023-04-02 23:59:59' | gzip > '/Users/richardq/data/logs/aws/nginx-access/2023-04-02.log.gz'
mkdir -p '/Users/richardq/data/logs/aws/nginx-access' && awslogs get 'nginx/access.log' -G -S --no-color -s '2023-04-03 00:00:00' -e '2023-04-03 23:59:59' | gzip > '/Users/richardq/data/logs/aws/nginx-access/2023-04-03.log.gz'
mkdir -p '/Users/richardq/data/logs/aws/nginx-access' && awslogs get 'nginx/access.log' -G -S --no-color -s '2023-04-04 00:00:00' -e '2023-04-04 23:59:59' | gzip > '/Users/richardq/data/logs/aws/nginx-access/2023-04-04.log.gz'
...
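The generator behind those lines is just date arithmetic. A rough shell sketch of what our wu command emits; the function name is hypothetical, and it relies on GNU date (date -d):

```shell
# Hypothetical re-implementation of the command generator: one awslogs call
# per day, output gzipped into one file per day. Requires GNU date.
gen_awslogs_cmds() {
  local group="$1" outdir="$2" start="$3" days="$4" day i
  for ((i = 0; i < days; i++)); do
    day=$(date -d "$start + $i day" +%F)   # e.g. 2023-04-01, 2023-04-02, ...
    printf "mkdir -p '%s' && awslogs get '%s' -G -S --no-color -s '%s 00:00:00' -e '%s 23:59:59' | gzip > '%s/%s.log.gz'\n" \
      "$outdir" "$group" "$day" "$day" "$outdir" "$day"
  done
}

# Print four days of commands for a log group:
gen_awslogs_cmds 'nginx/access.log' "$HOME/data/logs/aws/nginx-access" 2023-04-01 4
```

Piping that output through parallel -j 10 is what runs the downloads ten at a time.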

Running that list of commands via parallel produces one file for each day we've asked for on a log group. Multiple log groups can be requested, effectively allowing us to do bulk retrieval. Overall, it seems faster for us.

One real benefit is that the local files are split and gzipped, and many Linux commands can read gzipped content natively, without unzipping the file first.
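For instance, zcat and zgrep (shipped with gzip on most Linux systems) read the archives directly; the sample file below just stands in for one of the per-day log files:

```shell
# Work with a gzipped day file without decompressing it on disk.
tmp=$(mktemp -d)
printf 'GET /a 200\nGET /b 404\n' | gzip > "$tmp/2023-04-01.log.gz"

zcat "$tmp/2023-04-01.log.gz"              # stream the full contents
zgrep -c ' 404$' "$tmp/2023-04-01.log.gz"  # count matching lines; prints: 1
```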

Hopefully this helps.

answered 1 year ago
