Best way to aggregate multiple log files from several servers

Claes Mogren · Sep 17, 2008 · Viewed 44.2k times

I need a simple way to monitor multiple text log files distributed over a number of HP-UX servers. They are a mix of text and XML log files from several distributed legacy systems. Currently we just ssh to the servers and use tail -f and grep, but that doesn't scale when you have many logs to keep track of.

Since the logs are in different formats and are just files in folders (automatically rotated when they reach a certain size), I need to both collect them remotely and parse each one differently.

My initial thought was to write a simple daemon process to run on each server, with a custom file reader for each file type that parses it into a common format, which is then exported over the network via a socket. A viewer program running locally would connect to these sockets and show the parsed logs in some simple tabbed GUI, or aggregate them to a console.
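As a rough illustration of that design, here is a minimal sketch of such a per-server daemon in Python. Everything specific in it is a placeholder invented for the example (the log path, the port, and the field layout of the common format); a real version would plug in one parser per file type and handle log rotation.

    import json
    import socket
    import socketserver
    import time

    LOG_PATH = "/var/log/myapp/app.log"   # hypothetical log file
    PORT = 9099                           # arbitrary port for viewers to connect to

    def parse_line(line):
        # Placeholder parser: a real daemon would have one of these per
        # file type (plain text, XML, ...), all emitting the same fields.
        return {
            "host": socket.gethostname(),
            "time": time.strftime("%Y-%m-%dT%H:%M:%S"),
            "source": LOG_PATH,
            "message": line.rstrip("\n"),
        }

    def follow(path):
        # Minimal tail -f: seek to the end, then yield new lines as they
        # appear. (A production version would also detect rotation and
        # reopen the file.)
        with open(path, "r") as f:
            f.seek(0, 2)
            while True:
                line = f.readline()
                if not line:
                    time.sleep(0.5)
                    continue
                yield line

    class LogStreamHandler(socketserver.StreamRequestHandler):
        def handle(self):
            # Stream one JSON object per line to each connected viewer.
            for line in follow(LOG_PATH):
                self.wfile.write((json.dumps(parse_line(line)) + "\n").encode())

    if __name__ == "__main__":
        with socketserver.ThreadingTCPServer(("", PORT), LogStreamHandler) as server:
            server.serve_forever()

Line-delimited JSON is a reasonable candidate for the common format in a scheme like this, since every per-file parser can map into the same small set of fields and any viewer can consume the stream without extra machinery.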

What log format should I try to convert to if I am to implement it this way?

Is there some other, easier way? Should I attempt to translate the log files to the log4j format to use them with Chainsaw, or are there better log viewers that can connect to remote sockets? Could I use BareTail, as suggested in another log question? This is not a massively distributed system, and changing the current logging implementations for all applications to use UDP broadcast or to put messages on a JMS queue is not an option.
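On the Chainsaw idea: Chainsaw can receive events over a socket in log4j's XMLLayout format, so a daemon like the one sketched above could emit that instead of a home-grown format. A hedged sketch of the conversion, where the logger name, level mapping, and thread are assumptions about how the legacy fields would be mapped:

    import time
    from xml.sax.saxutils import escape

    def to_log4j_xml(record):
        # Render a parsed record as a log4j XMLLayout-style event.
        # "logger", "level" and "thread" are made-up mappings here; a real
        # converter would derive them from each legacy log's own fields.
        return (
            '<log4j:event logger="%s" timestamp="%d" level="%s" thread="main">'
            '<log4j:message><![CDATA[%s]]></log4j:message>'
            '</log4j:event>\n' % (
                escape(record.get("source", "legacy"), {'"': "&quot;"}),
                int(time.time() * 1000),  # log4j timestamps are epoch milliseconds
                record.get("level", "INFO"),
                record.get("message", ""),
            )
        )

Messages that themselves contain "]]>" would need extra care, but this shows the shape of event Chainsaw expects.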

Answer

mrm · Aug 13, 2010

Probably the lightest-weight solution for real-time log watching is to use Dancer's shell (dsh) in concurrent mode with tail -f:

dsh -Mac -- tail -f /var/log/apache/*.log
  • The -a targets all machine names defined in ~/.dsh/machines.list
  • The -c runs the tail commands concurrently on all machines
  • The -M prepends the machine name to every line of output
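One caveat, depending on your local setup rather than on dsh itself: if the glob happens to match files on the machine you launch dsh from, your local shell may expand /var/log/apache/*.log before dsh ever sees it. Quoting the command keeps the expansion on each remote host:

    dsh -Mac -- 'tail -f /var/log/apache/*.log'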