
Docker 1.13 qwatch log/events handler

Last weekend I was hacking on my metrics collector to make it fetch docker-engine and docker-container stats.

But since my meetup in Berlin is called M.E.L.I.G. (Metrics, Events, Logs, Inventory and Glue), there is much more to it than metrics.

This weekend I hacked on a new tool: qwatch.

It is meant to be something like qcollect for logs and events, inspired by the input, filter, and output notion of Logstash, even though for now I have only implemented inputs and outputs.

It needs v1.25 of the Docker Remote API and therefore docker-v1.13.


The following inputs are available:

  • DockerLogs provides a GELF endpoint (non-compressed) so that containers can send their stdout/stderr
  • DockerEvents hooks into the docker events API and fetches those events

Output-wise I have:

  • Log: pushes the logs to stdout
  • Elasticsearch: indexes the logs into Elasticsearch (kinda brittle in this first iteration)

As a good software engineer, I created (empty) test files:

ok            0.058s    coverage: 0.0% of statements
ok        0.042s    coverage: 40.9% of statements
ok 0.022s    coverage: 0.0% of statements
ok     0.010s    coverage: 0.0% of statements
ok     0.073s    coverage: 0.0% of statements
ok      0.014s    coverage: 100.0% of statements
ok      0.011s    coverage: 0.0% of statements

At least the types are already covered... I will cover more - promise!

Spin it up

OK, despite the poor coverage... Let's see what I've done so far.

The server provides the GELF endpoint and hooks into the event-API (as said above).

$ ./bin/qwatch_v0.3.3_Darwin server
Using config file: /Users/kniepbert/.qnib/qwatch.yml
Start DockerLog collector listening on port 12201

If I start a fresh container using an image that is not already present locally, it goes like this:

$ docker run -t --rm --name hello --log-driver gelf \
             --log-opt gelf-address=udp://localhost:12201 \
             --log-opt gelf-compression-type=none \
             debian:latest echo "Hello World on '$(date)'"
Unable to find image 'debian:latest' locally
latest: Pulling from library/debian

386a066cd84a: Already exists
Digest: sha256:c1ce85a0f7126a3b5cbf7c57676b01b37c755b9ff9e2f39ca88181c02b985724
Status: Downloaded newer image for debian:latest
Hello World on 'Sun Nov 27 13:05:49 CET 2016'

I did not implement the log transfer as a broadcast yet (just noticed), so the two outputs compete for the logs. But let's use only the Elasticsearch output:


There is a lot of refinement left to do, but I find it thrilling. All the possibilities...