Pipe Docker container logs to AWS CloudWatch

Learn how to read your application logs in CloudWatch...
May 27 2021 · 3 min read

Introduction 

There comes a time when you can’t simply keep writing all your logs to log files or leave them inside a Docker container. Many platforms ship with built-in CloudWatch agents, but what if your logging setup doesn’t provide one by default? No need to worry, every problem comes with a solution 😃.

Drawbacks of writing logs to files and containers:

  1. When we write application logs to files, we have to maintain them ourselves (you can of course rotate them over a certain time period).
  2. If we write logs to the Docker container instead of files, there will come a time when we need historical logs to trace our application’s flow or to hunt down a bug, and the container won’t have that long-lived history.

Advantage of writing logs to the CloudWatch console:

  1. AWS CloudWatch persists logs for a retention period we define (1 day, 3 days, 5 days, 7 days, 1 month, …); once the retention period expires, it removes the logs automatically (see the CLI sketch just below this list).
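
If you’d rather set the retention period from the terminal than from the console, the AWS CLI exposes it directly. A minimal sketch, using the myLogGroup example log group name used throughout this article:

# keep log events in myLogGroup for 7 days, then let CloudWatch expire them
$ aws logs put-retention-policy \
    --log-group-name myLogGroup \
    --retention-in-days 7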


Here, we’ll see how to pipe application logs to the AWS CloudWatch console, assuming you’re already familiar with Docker deployment on an AWS EC2 instance and have basic knowledge of AWS CloudWatch (if not, you can get more info here). You can use the same trick with any project that is deployed on AWS EC2 using Docker.

Note: Verify that the IAM role attached to your AWS EC2 instance has permission to access AWS CloudWatch Logs; if not, you can see how to allow an IAM user to access CloudWatch logs.
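
For instance, one quick way to grant that permission is to attach AWS’s managed CloudWatchLogsFullAccess policy to the instance role. A sketch, assuming your instance role is named myEc2Role (a placeholder; in practice you’d likely scope the policy down to specific log groups):

# attach the managed CloudWatch Logs policy to the EC2 instance role
$ aws iam attach-role-policy \
    --role-name myEc2Role \
    --policy-arn arn:aws:iam::aws:policy/CloudWatchLogsFullAccess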

When running Docker, you can add the awslogs logging driver with the following options.

$ docker run \
    --log-driver=awslogs \
    --log-opt awslogs-region=eu-central-1 \
    --log-opt awslogs-group=myLogGroup \
    --log-opt awslogs-stream=myLogStream \
    ...

where,

  • awslogs-region = the AWS region in which the EC2 instance is deployed
  • awslogs-group = the log group created in AWS CloudWatch
  • awslogs-stream = the log stream created inside the AWS CloudWatch log group

    Find more details on creating log groups and log streams here.
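
If you prefer the terminal over the console, the same log group and log stream can be created with the AWS CLI. A minimal sketch using the example names from the command above:

# create the log group, then a log stream inside it
$ aws logs create-log-group --log-group-name myLogGroup
$ aws logs create-log-stream \
    --log-group-name myLogGroup \
    --log-stream-name myLogStream

Alternatively, the awslogs driver can create the log group itself if you pass --log-opt awslogs-create-group=true, provided the instance role allows logs:CreateLogGroup.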

Or you can specify which logging driver you’re going to use in your docker-compose.yml, like the following.

docker-compose.yml

myService:
  logging:
    driver: awslogs
    options:
      awslogs-region: eu-central-1
      awslogs-group: myLogGroup
      awslogs-stream: myLogStream

Voila! You can see in the AWS CloudWatch console a log group named myLogGroup, inside which the log stream myLogStream is recording your application logs (or verify it from the terminal as shown below).
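
To double-check from the terminal that events are actually arriving, you can read the stream back with the AWS CLI. A small sketch (the tail subcommand needs AWS CLI v2):

# print the most recent events from the stream
$ aws logs get-log-events \
    --log-group-name myLogGroup \
    --log-stream-name myLogStream \
    --limit 20

# or follow the whole log group live (AWS CLI v2)
$ aws logs tail myLogGroup --follow
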
We can also set extra options such as awslogs-datetime-format.
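
For example, awslogs-datetime-format tells the driver how to recognize the start of a new log entry, so multi-line output (such as a stack trace) is grouped into a single CloudWatch event. A sketch, assuming your log lines start with a timestamp like 2021-05-27 10:15:00:

$ docker run \
    --log-driver=awslogs \
    --log-opt awslogs-region=eu-central-1 \
    --log-opt awslogs-group=myLogGroup \
    --log-opt awslogs-stream=myLogStream \
    --log-opt awslogs-datetime-format='%Y-%m-%d %H:%M:%S' \
    ...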

Example — Rails application logging to CloudWatch

Let’s write production logs into CloudWatch.

If you’re currently writing your logs to a file, replace your logging configuration with the following code so logs are written directly to the Docker container’s output.

config/environments/production.rb

if ENV["RAILS_LOG_TO_STDOUT"].present?
  # log to STDOUT so Docker's logging driver can pick the output up
  logger = ActiveSupport::Logger.new(STDOUT)
  logger.formatter = config.log_formatter
  config.logger = ActiveSupport::TaggedLogging.new(logger)
end

Here, STDOUT writes the logs directly to the Docker container’s output instead of log files, and from there they are piped to AWS CloudWatch (see the sketch below).
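
Putting it all together, the Rails container just needs the RAILS_LOG_TO_STDOUT environment variable set alongside the awslogs driver options. A sketch, where my-rails-image is a placeholder for your own image:

$ docker run \
    -e RAILS_LOG_TO_STDOUT=1 \
    --log-driver=awslogs \
    --log-opt awslogs-region=eu-central-1 \
    --log-opt awslogs-group=myLogGroup \
    --log-opt awslogs-stream=myLogStream \
    my-rails-image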

Hope you got some useful clues about piping logs to AWS CloudWatch directly from the Docker container.
