I am using the nginx module for Filebeat to send log data to Elasticsearch. Here is my Filebeat configuration:
output:
  logstash:
    enabled: true
    hosts:
      - logstash:5044
    timeout: 15

filebeat.modules:
  - module: nginx
    access:
      enabled: true
      var.paths: ["/var/log/nginx/access.log"]
    error:
      enabled: true
      var.paths: ["/var/log/nginx/error.log"]
The problem is that the logs are not parsed. This is what I see in Kibana:
{
  "_index": "filebeat-2017.07.18",
  "_type": "log",
  "_id": "AV1VLXEbhj7uWd8Fgz6M",
  "_version": 1,
  "_score": null,
  "_source": {
    "@timestamp": "2017-07-18T10:10:24.791Z",
    "offset": 65136,
    "@version": "1",
    "beat": {
      "hostname": "06d09033fb23",
      "name": "06d09033fb23",
      "version": "5.5.0"
    },
    "input_type": "log",
    "host": "06d09033fb23",
    "source": "/var/log/nginx/access.log",
    "message": "10.15.129.226 - - [18/Jul/2017:12:10:21 +0200] \"POST /orders-service/orders/v1/sessions/update/FUEL_DISPENSER?api_key=vgxt5u24uqyyyd9gmxzpu9n7 HTTP/1.1\" 200 5 \"-\" \"Mashery Proxy\"",
    "type": "log",
    "tags": [
      "beats_input_codec_plain_applied"
    ]
  },
  "fields": {
    "@timestamp": [
      1500372624791
    ]
  },
  "sort": [
    1500372624791
  ]
}
The parsed nginx fields listed in the documentation are missing: https://www.elastic.co/guide/en/beats/filebeat/current/exported-fields-nginx.html
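For context, here is roughly what I expect to be extracted from the `message` above. This is just a hand-rolled sketch of the nginx combined log format, not the module's actual grok pattern, and the group names are only loosely modeled on the `nginx.access.*` fields from that page:

```python
import re

# Sample line from /var/log/nginx/access.log (taken from the "message" field above).
line = ('10.15.129.226 - - [18/Jul/2017:12:10:21 +0200] '
        '"POST /orders-service/orders/v1/sessions/update/FUEL_DISPENSER'
        '?api_key=vgxt5u24uqyyyd9gmxzpu9n7 HTTP/1.1" 200 5 "-" "Mashery Proxy"')

# Hand-rolled regex for the nginx "combined" log format -- a rough stand-in
# for the grok pattern the Filebeat module's ingest pipeline uses.
pattern = re.compile(
    r'(?P<remote_ip>\S+) - (?P<user_name>\S+) \[(?P<time>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<url>\S+) HTTP/(?P<http_version>\S+)" '
    r'(?P<response_code>\d+) (?P<body_sent_bytes>\d+) '
    r'"(?P<referrer>[^"]*)" "(?P<agent>[^"]*)"'
)

fields = pattern.match(line).groupdict()
print(fields['remote_ip'])       # 10.15.129.226
print(fields['method'])          # POST
print(fields['response_code'])   # 200
```

None of these fields show up on the indexed document; I only get the raw `message` string.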
Why are log lines not parsed?
When you run filebeat -v -modules=nginx -setup, it will, among other things, create the ingest pipelines that parse:
- nginx access logs
- nginx error logs
These pipelines are stored on the Elasticsearch ingest node. You can list them at:
http://YourElasticHost:9200/_ingest/pipeline
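For example, with curl (substitute your Elasticsearch host; the exact pipeline IDs depend on your Filebeat version):

```shell
# List all ingest pipelines; the nginx ones show up with IDs like
# "filebeat-<version>-nginx-access-..." (check the actual IDs on your cluster).
curl -s 'http://YourElasticHost:9200/_ingest/pipeline?pretty'
```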
So if you want your logs parsed, you need to send them through the ingest node. With your current configuration, events go to Logstash, which does not apply these pipelines by itself.
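One way to keep Logstash in the middle is to make its elasticsearch output hand events to the ingest pipeline. This is only a sketch: the pipeline ID `filebeat-5.5.0-nginx-access-default` is an assumption standing in for whatever ID GET _ingest/pipeline actually reports on your cluster:

```
# Logstash pipeline sketch. Assumptions: a Beats input on 5044, and a
# pipeline ID of "filebeat-5.5.0-nginx-access-default" -- replace it with
# the ID your cluster reports under GET _ingest/pipeline.
input {
  beats {
    port => 5044
  }
}

output {
  elasticsearch {
    hosts    => ["YourElasticHost:9200"]
    # Hand events to the ingest pipeline so the nginx fields get parsed:
    pipeline => "filebeat-5.5.0-nginx-access-default"
  }
}
```

Alternatively, if you do not need Logstash at all, point Filebeat directly at Elasticsearch (output.elasticsearch instead of output.logstash); the module then selects the right pipeline on its own.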