I am wondering how to create separate indices for different logs fetched into logstash (and later passed on to elasticsearch), so that in kibana I can define index patterns for them and discover them.
In my case, I have a few client servers (each with filebeat installed) and a centralized log server (ELK). Each client server has different kinds of logs, e.g. redis.log, python logs, mongodb logs, which I would like to sort into different indices and store in elasticsearch.
Each client server also serves a different purpose, e.g. databases, UIs, applications. Hence I would also like to give them different index names (by changing the output index in filebeat.yml?).
In your Filebeat configuration you can use document_type to identify the different logs that you have. Then inside of Logstash you can use the value of the type field to control the destination index.
However, before you separate your logs into different indices you should consider leaving them in a single index and using either type or some custom field to distinguish between log types. See index vs type.
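If you go the single-index route, a rough sketch of the Logstash output (assuming the same beats input as in the example below; the filebeat- index name is only the conventional default and can be anything) would be:

output {
  if [@metadata][beat] == "filebeat" {
    elasticsearch {
      hosts => "localhost:9200"
      manage_template => false
      # Everything lands in one shared daily index; each log kind keeps its
      # own type, so in Kibana you can filter with e.g. type:redis.
      index => "filebeat-%{+YYYY.MM.dd}"
      document_type => "%{[@metadata][type]}"
    }
  }
}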
Example Filebeat prospector config:
filebeat:
  prospectors:
    - paths:
        - /var/log/redis/*.log
      document_type: redis
    - paths:
        - /var/log/python/*.log
      document_type: python
    - paths:
        - /var/log/mongodb/*.log
      document_type: mongodb
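You also mentioned wanting to tell the servers themselves apart (databases vs UIs vs applications). One way, sketched here with a made-up field name server_role, is to add a custom field per server; Filebeat ships it under fields, so it is available in Logstash as [fields][server_role] for filtering or for building index names:

filebeat:
  prospectors:
    - paths:
        - /var/log/redis/*.log
      document_type: redis
      # Hypothetical custom field describing this server's purpose; with the
      # default fields_under_root: false it arrives as [fields][server_role].
      fields:
        server_role: database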
Example Logstash config:
input {
  beats {
    port => 5044
  }
}

output {
  # Customize elasticsearch output for Filebeat.
  if [@metadata][beat] == "filebeat" {
    elasticsearch {
      hosts => "localhost:9200"
      manage_template => false
      # Use the Filebeat document_type value for the Elasticsearch index name.
      index => "%{[@metadata][type]}-%{+YYYY.MM.dd}"
      document_type => "log"
    }
  }
}
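With that output, the document_type values become the index names, so the example produces daily indices like redis-&lt;date&gt;, python-&lt;date&gt;, and mongodb-&lt;date&gt;. In Kibana you would then add matching index patterns (redis-*, python-*, mongodb-*) to discover each set of logs separately.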