How can I view log files in Linux and apply custom filters while viewing?

Dan · Feb 26, 2010 · Viewed 31.8k times

I need to read through some gigantic log files on a Linux system. There's a lot of clutter in the logs. At the moment I'm doing something like this:

cat logfile.txt | grep -v "IgnoreThis\|IgnoreThat" | less

But it's cumbersome -- every time I want to add another filter, I need to quit less and edit the command line. Some of the filters are relatively complicated and may be multi-line.

I'd like some way to apply filters as I am reading through the log, and a way to save these filters somewhere.

Is there a tool that can do this for me? I can't install new software, so hopefully it's something that's already installed -- e.g., less, vi, something in a Python or Perl library, etc.

Changing the code that generates the log so that it produces less output is not an option.

Answer

ALF · Jul 4, 2012

Use the &pattern command within less.

From the man page for less:

&pattern

          Display  only  lines which match the pattern; lines which do not
          match the pattern are not displayed.  If pattern  is  empty  (if
          you  type  &  immediately  followed  by ENTER), any filtering is
          turned off, and all lines are displayed.  While filtering is  in
          effect,  an  ampersand  is  displayed  at  the  beginning of the
          prompt, as a reminder that some lines in the file may be hidden.

          Certain characters are special as in the / command:

          ^N or !
                 Display only lines which do NOT match the pattern.

          ^R     Don't interpret regular expression  metacharacters;  that
                 is, do a simple textual comparison.