2 Answers
Hi,
Elastic Beanstalk's log backup to S3 is performed periodically as a batch upload by logrotate, which means, as you rightly pointed out, that the latest logs may not yet be in S3 when the instance terminates.
Two possible solutions:
- Configure a shutdown script that uploads all log files to S3 when the instance is gracefully shut down. It may not run if the instance crashes or fails in some abrupt way, but that is a fairly rare event.
- Configure CloudWatch Logs streaming, which feeds the logs to CloudWatch Logs in near real time, as soon as they are generated. Even in the case of a shutdown you will still have the latest logs in CloudWatch Logs up to the point where the awslogs agent is terminated. It's not S3, but it's worth considering as an option; the Elastic Beanstalk documentation describes how to set it up.
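The first option above could be sketched as a systemd unit whose ExecStop hook runs during shutdown. Everything here is an assumption to illustrate the idea: the bucket name, the unit and script paths, and the log directories are placeholders you would replace for your environment.

```shell
#!/bin/sh
# /usr/local/bin/upload-logs-on-shutdown.sh  (hypothetical path)
# Uploads remaining logs to S3; invoked by the systemd unit below at shutdown.
BUCKET="s3://your-log-bucket"   # assumption: replace with your bucket

# Fetch this instance's ID from the instance metadata service (IMDSv2).
TOKEN=$(curl -s -X PUT "http://169.254.169.254/latest/api/token" \
    -H "X-aws-ec2-metadata-token-ttl-seconds: 60")
INSTANCE_ID=$(curl -s -H "X-aws-ec2-metadata-token: $TOKEN" \
    http://169.254.169.254/latest/meta-data/instance-id)

# Sync whatever is still on disk to a per-instance prefix.
aws s3 sync /var/log/nginx "$BUCKET/$INSTANCE_ID/nginx/"
aws s3 sync /var/app/current/storage/logs "$BUCKET/$INSTANCE_ID/app/"
```

A matching unit (e.g. `/etc/systemd/system/upload-logs-on-shutdown.service`) stays "active" after boot so that its `ExecStop` fires when the instance shuts down:

```
[Unit]
Description=Upload logs to S3 on shutdown
After=network-online.target

[Service]
Type=oneshot
RemainAfterExit=yes
ExecStart=/bin/true
ExecStop=/usr/local/bin/upload-logs-on-shutdown.sh
TimeoutStopSec=120

[Install]
WantedBy=multi-user.target
```

Enable it with `systemctl enable --now upload-logs-on-shutdown.service`. The instance profile must allow `s3:PutObject` on the target bucket for the sync to succeed.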
I am using a workaround currently like this:
After editing my rotatelogs.sh file, I manually send the rotated logs to the S3 bucket. Note: you have to set the required IAM policies before using a script like this, so it can obtain instance and environment information.
.platform/hooks/prebuild/rotatelogs.sh content:
#!/bin/bash
# Abort silently if logrotate is not installed.
test -x /usr/sbin/logrotate || exit 0

# Force-rotate the nginx and Laravel logs before uploading.
/usr/sbin/logrotate --force /etc/logrotate.elasticbeanstalk.daily/logrotate.elasticbeanstalk.nginx.conf
/usr/sbin/logrotate --force /etc/logrotate.elasticbeanstalk.daily/logrotate.elasticbeanstalk.laravel.conf

# CONFIGURE: replace with your environment name and region.
environment_name="your_environment_name"
region="your_region"

# Get the environment ID using the AWS CLI.
environment_id=$(aws elasticbeanstalk describe-environments --environment-names "$environment_name" --query "Environments[0].EnvironmentId" --output text --region "$region")

# Get this instance's ID from the instance metadata service (IMDSv2).
# (An unfiltered "aws ec2 describe-instances" would return an arbitrary
# instance in the account, not necessarily this one.)
token=$(curl -s -X PUT "http://169.254.169.254/latest/api/token" -H "X-aws-ec2-metadata-token-ttl-seconds: 60")
instance_id=$(curl -s -H "X-aws-ec2-metadata-token: $token" http://169.254.169.254/latest/meta-data/instance-id)

source_paths=("/var/app/current/storage/logs" "/var/log/nginx/rotated")
destination_path="s3://ondeploy-logs/$environment_id/$instance_id/"

# Move the rotated (gzipped) logs to S3.
for path in "${source_paths[@]}"; do
    aws s3 mv "$path" "$destination_path" --recursive --exclude "*" --include "*.gz" --region "$region"
done
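The "required policies" mentioned above could look roughly like the following minimal IAM policy attached to the instance profile. This is a sketch, not a definitive policy: the bucket name is an assumption, and `ec2:DescribeInstances` is only needed if you look up the instance ID through the AWS CLI rather than the instance metadata service.

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "DescribeForIds",
      "Effect": "Allow",
      "Action": [
        "elasticbeanstalk:DescribeEnvironments",
        "ec2:DescribeInstances"
      ],
      "Resource": "*"
    },
    {
      "Sid": "UploadRotatedLogs",
      "Effect": "Allow",
      "Action": [
        "s3:PutObject",
        "s3:ListBucket"
      ],
      "Resource": [
        "arn:aws:s3:::ondeploy-logs",
        "arn:aws:s3:::ondeploy-logs/*"
      ]
    }
  ]
}
```

`aws s3 mv` from local disk to S3 only uploads and then deletes the local file, so no S3 delete permission is required.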
answered 5 months ago
Hi, thank you for the answer. I may consider using CloudWatch Log Streaming in the future.