Which Serilog sink to use for sending to Logstash?

Vagif Abilov · Aug 13, 2014

We started using Serilog in combination with Elasticsearch, and it's a very efficient way to store structured log data (and later visualize it using tools like Kibana). However, I see the advantage of not writing log data directly to the backend but instead configuring a log broker such as Logstash that can take responsibility for adding tags to log messages, selecting indices, etc. With this setup, applications won't need any knowledge of how log data is distributed.
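
For context, a direct-to-Elasticsearch configuration of the kind described here looks roughly like the following (a minimal sketch assuming the Serilog.Sinks.Elasticsearch package; the node URL and index format are placeholders, not an actual setup):

```csharp
using System;
using Serilog;
using Serilog.Sinks.Elasticsearch;

// Minimal sketch: write structured events straight from the application
// to an Elasticsearch node (placeholder URL and index format).
Log.Logger = new LoggerConfiguration()
    .Enrich.FromLogContext()
    .WriteTo.Elasticsearch(new ElasticsearchSinkOptions(new Uri("http://localhost:9200"))
    {
        AutoRegisterTemplate = true,             // push an index template on startup
        IndexFormat = "app-logs-{0:yyyy.MM.dd}"  // one index per day
    })
    .CreateLogger();

// Structured properties become fields that Kibana can query directly.
Log.Information("Processed order {OrderId} in {Elapsed} ms", 42, 120);

Log.CloseAndFlush();
```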

With Logstash in the middle, the question is which Serilog sink is best to use so that Logstash can import its data without applying advanced and CPU-intensive filters. I've seen Redis mentioned as a good companion to Logstash, but Serilog doesn't have a Redis sink. Any recommendations for a Serilog sink whose data can easily be transferred by Logstash to an Elasticsearch index?

There is even an approach that uses the Elasticsearch sink first and then loops the data back into Elasticsearch again after some rearranging and the application of extra tags.

Answer

FantasticFiasco · Nov 3, 2016

The accepted answer was written before the Serilog.Sinks.Http sink existed.

Instead of logging to a file and having Filebeat monitor it, one could have the HTTP sink post log events to the Logstash HTTP input plugin. This would mean fewer moving parts on the instances where the logs were created.
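
As a rough sketch of that approach (the endpoint is a placeholder, and the exact sink options differ between Serilog.Sinks.Http versions; the queueLimitBytes argument shown here, for example, only exists in the more recent releases):

```csharp
using Serilog;

// Minimal sketch: POST batched JSON log events to a Logstash "http" input.
Log.Logger = new LoggerConfiguration()
    .WriteTo.Http(
        requestUri: "http://logstash.example.com:8080",  // placeholder endpoint
        queueLimitBytes: null)  // null = unbounded buffer in newer sink versions
    .CreateLogger();

Log.Information("Processed order {OrderId} in {Elapsed} ms", 42, 120);

// Flush any buffered batches before the process exits.
Log.CloseAndFlush();
```

On the Logstash side, the matching piece is the http input plugin listening on the same port; since the sink already sends structured JSON, the pipeline typically needs little or no filtering before the events are written to Elasticsearch.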