How logging works on GOV.UK


Each machine sends its Syslog messages to a central logging machine, which listens on UDP port 514.

The logging machine sends all the Syslog messages it receives to a local Logstash, which sends those logs to logs-elasticsearch.
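Sending a line to the logging machine is just a UDP datagram in syslog format. As a minimal sketch (the tag "myapp" and the host are placeholders, not real GOV.UK names):

```python
import socket

def format_syslog(message, facility=1, severity=6, tag="myapp"):
    """Build an RFC 3164-style syslog line; 'myapp' is a placeholder tag."""
    pri = facility * 8 + severity          # e.g. user.info -> PRI 14
    return "<%d>%s: %s" % (pri, tag, message)

def send_syslog(message, host="localhost", port=514):
    """Fire the formatted line at the logging machine over UDP.

    host/port are illustrative; in our setup the central logging
    machine listens on UDP 514.
    """
    line = format_syslog(message)
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.sendto(line.encode("utf-8"), (host, port))
    return line
```

In practice each machine's syslog daemon does this forwarding; the sketch only shows the wire format.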

The logging machine keeps the logs at /srv/log/year/month/date/server-name

Logstream, logship and Redis

We have a defined type in our Puppet code which uses logship to tail logfiles.

The tailed logs are sent to logs-redis machines. The logging Elasticsearch cluster has a river to pull logs from Redis.

logship provides multiple shippers. We’re using the Redis shipper and the statsd shipper (for sending Nginx status codes to Graphite).
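The two shippers do conceptually simple things: one pushes each tailed line onto a Redis list as a JSON event, the other pulls the status code out of an Nginx access-log line and increments a counter. A minimal sketch of that idea (class names, the "logs" key, and the metric name are illustrative, not logship's actual internals):

```python
import json
import re

class RedisShipper:
    """Push each log line onto a Redis list as a JSON event."""
    def __init__(self, client, key="logs"):   # client: a redis.Redis-like object
        self.client, self.key = client, key

    def ship(self, line):
        self.client.rpush(self.key, json.dumps({"@message": line}))

class StatsdShipper:
    """Extract the Nginx status code and send a counter to statsd."""
    STATUS = re.compile(r'" (\d{3}) ')         # status follows the quoted request

    def __init__(self, send):                  # send: callable, e.g. a statsd client
        self.send = send

    def ship(self, line):
        m = self.STATUS.search(line)
        if m:
            self.send("nginx.http_%s" % m.group(1))
```

The real tool tails the file and hands each new line to every configured shipper.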


Logstash

Logstash runs on the logging machine. It listens for data on the TCP and UDP ports specified in its config, then applies filters to the log lines.
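The shape of such a config, as a hedged sketch (the ports, patterns and hostname here are illustrative; the real values live in our Puppet-managed config):

```
input {
  udp { port => 5514 type => "syslog" }
  tcp { port => 5514 type => "syslog" }
}
filter {
  grok {
    match => [ "message", "%{SYSLOGLINE}" ]
  }
}
output {
  elasticsearch { host => "logs-elasticsearch" }
}
```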

We use grok patterns in those filters to parse log lines into fields. A useful tool for testing patterns is the Grok Debugger.
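Grok patterns compile down to named regular expressions, so they can be prototyped outside Logstash. Here is a sketch mimicking the built-in IP/NUMBER patterns against a simplified Nginx access line (the pattern and field names are illustrative, not our production config):

```python
import re

# Named groups play the role of grok's %{PATTERN:field} captures.
NGINX = re.compile(
    r'(?P<clientip>\d{1,3}(?:\.\d{1,3}){3}) .* '
    r'"(?P<verb>[A-Z]+) (?P<request>\S+) \S+" '
    r'(?P<status>\d{3}) (?P<bytes>\d+)'
)

line = '10.0.0.1 - - [01/Jan/2014:00:00:01 +0000] "GET /path HTTP/1.1" 200 612'
fields = NGINX.search(line).groupdict()
```

Once the pattern matches in the debugger, the same fields come out of the grok filter.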

The output for Logstash is our logging Elasticsearch cluster.


Elasticsearch

We run a logging Elasticsearch cluster; you can list its machines with fab production class:logs_elasticsearch hosts

We use estools to rotate Elasticsearch indices daily, and apply an alias of logs-current for the current day’s logs.
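Daily rotation amounts to creating a dated index and repointing an alias; estools does this against the cluster. A minimal sketch of the logic, assuming the usual logs-YYYY.MM.DD naming (the naming and the dict standing in for the cluster's alias table are assumptions):

```python
from datetime import date

def index_name(day):
    """Daily index name, assuming a logs-YYYY.MM.DD convention."""
    return "logs-%s" % day.strftime("%Y.%m.%d")

def rotate(aliases, today):
    """Point the logs-current alias at today's index.

    `aliases` is a plain dict standing in for the cluster's alias
    table; the real tool calls the Elasticsearch aliases API.
    """
    aliases["logs-current"] = index_name(today)
    return aliases
```

Queries against logs-current therefore always hit the current day's index without callers needing to compute the date.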


Kibana

Kibana is the interface for viewing logs in Elasticsearch. We made kibana-gds as a wrapper to put Kibana behind GOV.UK Signon.

It’s deployed to the backend application servers.

There’s some documentation on useful Kibana queries for 2nd line.

This page is owned by #2ndline and needs to be reviewed