How can I ensure that rotated application logs are retained on S3 during the deployment of a new Elastic Beanstalk project?


How can I avoid losing logs during a new deployment in Elastic Beanstalk when using the 'S3 log storage' feature for rotated logs, considering that I rotate the project and Nginx logs daily, and the logs may not have rotated yet when a new deployment begins on the same day?

My current solution (not working): I have a bash script at '.platform/hooks/prebuild/rotatelogs.sh'. Although I force the logs to rotate, the rotated logs are not sent to S3 immediately.

  1. rotatelogs.sh file content:
#!/bin/sh
test -x /usr/sbin/logrotate || exit 0
/usr/sbin/logrotate --force /etc/logrotate.elasticbeanstalk.daily/logrotate.elasticbeanstalk.nginx.conf
/usr/sbin/logrotate --force /etc/logrotate.elasticbeanstalk.daily/logrotate.elasticbeanstalk.laravel.conf
  2. logrotate.elasticbeanstalk.nginx.conf file content:
      /var/log/nginx/* {
        su root root
        size 10M
        rotate 14
        missingok
        compress
        notifempty
        copytruncate
        dateext
        dateformat -%Y-%m-%d-%H_%s
        olddir /var/log/nginx/rotated
      }   
  3. logrotate.elasticbeanstalk.laravel.conf file content (a dry-run check for these configs is sketched after this list):
      /var/app/current/storage/logs/*.log {
        su root root
        size 10M
        rotate 14
        missingok
        compress
        notifempty
        copytruncate
        dateext
        dateformat -%Y-%m-%d-%H_%s
        olddir /var/app/current/storage/logs
      }
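A small usage note, in case it helps while debugging: logrotate can dry-run a config with --debug (nothing is rotated) before forcing a real rotation with --force; the paths below are the ones from the question and the flags are standard logrotate options.

    # show what logrotate would do, without rotating anything
    /usr/sbin/logrotate --debug /etc/logrotate.elasticbeanstalk.daily/logrotate.elasticbeanstalk.laravel.conf

    # force a rotation with verbose output
    /usr/sbin/logrotate --force --verbose /etc/logrotate.elasticbeanstalk.daily/logrotate.elasticbeanstalk.nginx.conf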
Akadi
asked 5 months ago (255 views)
2 Answers

Hi,

Elastic Beanstalk's log backup to S3 happens periodically as a batch upload driven by logrotate, which means, as you rightly pointed out, that the latest logs may not be in S3 yet when the instance terminates.

Two possible solutions:

  1. Configure a shutdown script that uploads all log files to S3 when the instance is being gracefully shut down (a minimal systemd sketch follows this list). It may not be executed if the instance crashes or fails in some abrupt way, but that's a pretty rare event.
  2. Configure CloudWatch Log Streaming, which feeds the logs to CloudWatch Logs in real time, as soon as they are generated. That means even in case of a shutdown you'll still have the latest logs in CloudWatch Logs up to the point where the awslogs agent gets terminated. It's not S3, but it is worth considering as an option. Have a look at the Elastic Beanstalk documentation on streaming instance logs to CloudWatch Logs for how to set it up; an example .ebextensions snippet also follows this list.
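For option 1, one possible shape is a systemd "oneshot" unit whose ExecStop runs an upload script when the instance is shutting down. This is a minimal sketch, not the script from the original link: the unit name, the helper path /usr/local/bin/upload-logs-to-s3.sh, and the assumption that the instance profile can write to the bucket are all hypothetical.

    # /etc/systemd/system/upload-logs-on-shutdown.service (hypothetical name)
    [Unit]
    Description=Upload rotated logs to S3 before the instance stops
    After=network-online.target
    Wants=network-online.target

    [Service]
    Type=oneshot
    RemainAfterExit=yes
    ExecStart=/bin/true
    # ExecStop runs when the unit is stopped, i.e. during shutdown/termination,
    # while the network is still up because of the ordering above
    ExecStop=/usr/local/bin/upload-logs-to-s3.sh

    [Install]
    WantedBy=multi-user.target

The unit would be installed and enabled from a platform hook or .ebextensions command (for example, systemctl enable --now upload-logs-on-shutdown.service), and the helper script can reuse the same aws s3 mv loop shown in the accepted answer below.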
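For option 2, instance log streaming can be turned on from the application bundle with an .ebextensions option-settings file. A minimal sketch; the file name and the retention value are just placeholders:

    # .ebextensions/cloudwatch-logs.config
    option_settings:
      aws:elasticbeanstalk:cloudwatch:logs:
        StreamLogs: true
        DeleteOnTerminate: false
        RetentionInDays: 14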
EXPERT
answered 5 months ago
  • Hi, thank you for the answer. I may consider using CloudWatch Log Streaming in the future.

Accepted Answer

I am currently using a workaround like this:

I edited my rotatelogs.sh file so that it explicitly sends the rotated logs to the S3 bucket itself. Note: you have to attach the required policies to the instance profile before using a script like this, so that it can obtain the environment information and write to the bucket (a rough sketch of such a policy follows the script).

.platform/hooks/prebuild/rotatelogs.sh content:

#!/bin/bash
test -x /usr/sbin/logrotate || exit 0
/usr/sbin/logrotate --force /etc/logrotate.elasticbeanstalk.daily/logrotate.elasticbeanstalk.nginx.conf
/usr/sbin/logrotate --force /etc/logrotate.elasticbeanstalk.daily/logrotate.elasticbeanstalk.laravel.conf

# CONFIGURE: replace with your environment name
environment_name="your_environment_name"

# CONFIGURE: replace with your AWS region
region="your_region"

# get the environment ID using the AWS CLI
environment_id=$(aws elasticbeanstalk describe-environments --environment-names "$environment_name" --query "Environments[0].EnvironmentId" --output text --region "$region")

# get the ID of this instance from the EC2 instance metadata service (IMDSv2)
imds_token=$(curl -s -X PUT "http://169.254.169.254/latest/api/token" -H "X-aws-ec2-metadata-token-ttl-seconds: 60")
instance_id=$(curl -s -H "X-aws-ec2-metadata-token: $imds_token" http://169.254.169.254/latest/meta-data/instance-id)

source_paths=("/var/app/current/storage/logs" "/var/log/nginx/rotated")
destination_path="s3://ondeploy-logs/$environment_id/$instance_id/"

# move rotated logs to s3
for path in "${source_paths[@]}"; do
    aws s3 mv "$path" "$destination_path" --recursive --exclude "*" --include "*.gz" --region "$region"
done
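As a rough illustration of the "required policies" mentioned above, the instance profile would need at least permissions like the following. This is a hedged sketch: the bucket name ondeploy-logs is taken from the script, and the statement list may need to be broadened for your setup.

    {
      "Version": "2012-10-17",
      "Statement": [
        {
          "Sid": "DescribeEnvironment",
          "Effect": "Allow",
          "Action": "elasticbeanstalk:DescribeEnvironments",
          "Resource": "*"
        },
        {
          "Sid": "UploadRotatedLogs",
          "Effect": "Allow",
          "Action": "s3:PutObject",
          "Resource": "arn:aws:s3:::ondeploy-logs/*"
        }
      ]
    }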
Akadi
answered 5 months ago
