Adding fields depending on event message in Logstash not working

Natsen · Apr 23, 2015 · Viewed 15.5k times

I have ELK installed and working on my machine, but now I want to do more complex filtering and field addition depending on the event message.

Specifically, I want to set "id_error" and "descripcio" depending on the message pattern.

I have tried a lot of configuration combinations in my "logstash.conf" file, but I am not able to get the expected behavior.

Can someone tell me what I am doing wrong, what I should do, or whether this is simply not possible? Thanks in advance.

This is my "logstash.conf" file, showing the last test I made, which results in no events being captured in Kibana:

input {
  file {
    path => "C:\xxx.log"
  }
}

filter {
  grok {
    patterns_dir => "C:\elk\patterns"
    match => [ "message", "%{ERROR2:error2}" ]
    add_field => [ "id_error", "2" ]
    add_field => [ "descripcio", "error2!!!" ]
  }
  grok {
    patterns_dir => "C:\elk\patterns"
    match => [ "message", "%{ERROR1:error1}" ]
    add_field => [ "id_error", "1" ]
    add_field => [ "descripcio", "error1!!!" ]
  }
  if ("_grokparsefailure" in [tags]) { drop {} }
}

output {
  elasticsearch {
    host => "localhost"
    protocol => "http"
    index => "xxx-%{+YYYY.MM.dd}"
  }
}
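
For reference, "ERROR1" and "ERROR2" are custom patterns defined in a file under "C:\elk\patterns". I have not pasted my real regexes here; a grok patterns file simply maps a name to a regular expression, one per line, so a hypothetical version of mine would look like this (the regexes below are placeholders, not my real ones):

ERROR1 Error\s1:\s.*
ERROR2 Error\s2:\s.*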

I have also tried the following configuration. It results in the fields "id_error" and "descripcio" holding both values, "[1,2]" and "[error1!!!,error2!!!]" respectively, in every matched event.

As "break_on_match" is set "true" by default, I expect getting only the fields behind the matching clause, but this doesn't occur.

input { 
  file {
    path => "C:\xxx.log"
  }
}

filter {
  grok {
    patterns_dir => "C:\elk\patterns"
    match => [ "message", "%{ERROR1:error1}" ]
    add_field => [ "id_error", "1" ]
    add_field => [ "descripcio", "error1!!!" ]
    match => [ "message", "%{ERROR2:error2}" ]
    add_field => [ "id_error", "2" ]
    add_field => [ "descripcio", "error2!!!" ]
  }
  if ("_grokparsefailure" in [tags]) { drop {} }
}

output {
  elasticsearch {
    host => "localhost"
    protocol => "http"
    index => "xxx-%{+YYYY.MM.dd}"
  }
}
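
As far as I can tell, when the same option appears more than once inside a single grok block, Logstash merges the values: the filter above effectively ends up with one pattern list containing both "%{ERROR1:error1}" and "%{ERROR2:error2}", plus one combined add_field list with all four entries. "break_on_match" only decides which pattern in that list wins; add_field is a common option that fires once the filter succeeds as a whole, whichever pattern matched, which would explain why every event receives both values. A minimal sketch of how a single grok with several patterns is normally written (matching only; there is no per-pattern add_field in this form):

filter {
  grok {
    patterns_dir => "C:\elk\patterns"
    break_on_match => true
    match => [
      "message", "%{ERROR1:error1}",
      "message", "%{ERROR2:error2}"
    ]
  }
}

Here only the capture field of the winning pattern (error1 or error2) is added to the event, so per-pattern fields need a different mechanism, such as conditionals.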

Answer

Natsen · Apr 24, 2015

I have solved the problem. Instead of attaching add_field to the grok filters, I test the message against each error pattern in a conditional and only then run the corresponding grok and mutate. I get the expected results with the following "logstash.conf":

input { 
  file {
    path => "C:\xxx.log"
  }
}

filter {
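  # this first grok only tags events that match neither pattern with
  # _grokparsefailure, so they can be dropped at the end of the filter block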
  grok {
    patterns_dir => "C:\elk\patterns"
    match => [ "message", "%{ERROR1:error1}" ]
    match => [ "message", "%{ERROR2:error2}" ]
  }
  if [message] =~ /error1_regex/ {
    grok {
      patterns_dir => "C:\elk\patterns"
      match => [ "message", "%{ERROR1:error1}" ]
    }
    mutate {
      add_field => [ "id_error", "1" ]
      add_field => [ "descripcio", "Error1!" ]
      remove_field => [ "message" ]
      remove_field => [ "error1" ]
    }
  }
  else if [message] =~ /error2_regex/ {
    grok {
      patterns_dir => "C:\elk\patterns"
      match => [ "message", "%{ERROR2:error2}" ]
    }
    mutate {
      add_field => [ "id_error", "2" ]
      add_field => [ "descripcio", "Error2!" ]
      remove_field => [ "message" ]
      remove_field => [ "error2" ]
    }
  }
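  # anything still tagged _grokparsefailure matched neither pattern and is discarded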
  if ("_grokparsefailure" in [tags]) { drop {} }
}

output {
  elasticsearch {
    host => "localhost"
    protocol => "http"
    index => "xxx-%{+YYYY.MM.dd}"
  }
}
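
A slightly leaner variant should also work (this is a sketch, assuming the error1_regex/error2_regex tests are reliable enough on their own to classify every event): drop the up-front grok and discard unclassified events in an else branch instead of relying on the _grokparsefailure tag:

filter {
  if [message] =~ /error1_regex/ {
    grok {
      patterns_dir => "C:\elk\patterns"
      match => [ "message", "%{ERROR1:error1}" ]
    }
    mutate {
      add_field => [ "id_error", "1" ]
      add_field => [ "descripcio", "Error1!" ]
      remove_field => [ "message", "error1" ]
    }
  }
  else if [message] =~ /error2_regex/ {
    grok {
      patterns_dir => "C:\elk\patterns"
      match => [ "message", "%{ERROR2:error2}" ]
    }
    mutate {
      add_field => [ "id_error", "2" ]
      add_field => [ "descripcio", "Error2!" ]
      remove_field => [ "message", "error2" ]
    }
  }
  else {
    drop {}
  }
}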