Kafka input to logstash plugin

ScipioAfricanus · Mar 15, 2017 · Viewed 11.1k times

I am attempting to read from a Kafka cluster of 3 servers into Logstash in order to write the data to a syslog server. I have writing to syslog working, but even on the Logstash documentation site I am not able to find how to read from Kafka into Logstash.

Logstash version: 5.2.2, Kafka version: 0.10.2, Scala version: 2.11

I also checked the plugin API version: LOGSTASH_CORE_PLUGIN_API = "2.1.12"

This is the config I attempted to use:

input {
#    file
#    {
#        path => "/opt/logstash/NOTICE.TXT"
#
#        #DEBUG below
#        #path => "../fsdfdstt.log"
#        start_position => "beginning"
#        sincedb_path => "/dev/null"
#    }
    kafka
    {
        zk_connect => "localhost:2181"
        topic_id => "kafkatest2"
    }
}
output
{
    syslog
    {
        host => ["targetserver"]
        port => port#
    }
}

But this is the error I am getting:

[2017-03-15T10:24:17,000][ERROR][logstash.inputs.kafka    ] Unknown setting 'zk_connect' for kafka
[2017-03-15T10:24:17,008][ERROR][logstash.inputs.kafka    ] Unknown setting 'topic_id' for kafka
[2017-03-15T10:24:17,015][ERROR][logstash.agent           ] Cannot load an invalid configuration {:reason=>"Something is wrong with your configuration."}

Also, I found some people using group_id in the kafka input. I am not sure if my cluster has a group id.

Thanks, Karan

Answer

Val · Mar 15, 2017

Your kafka input config needs to look like this instead:

    kafka
    {
        bootstrap_servers => "localhost:9092"
        topics => ["kafkatest2"]
    }

Since Kafka 0.10, the plugin no longer connects through Zookeeper but directly to one of your Kafka brokers, so zk_connect has been replaced by bootstrap_servers (note the broker port 9092, not the Zookeeper port 2181). The topic_id setting has also been renamed to topics and now takes an array. As for group_id: it is not a property of your cluster but simply the name of the consumer group this input joins (it defaults to "logstash"), so you only need to set it if you want to coordinate multiple consumers.
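Combined with the syslog output from your question, the whole pipeline could look like the sketch below. The broker address and topic come from your config; the syslog host is your placeholder, and the port (514, the conventional syslog port) is an assumption you should replace with your actual value:

    input
    {
        kafka
        {
            bootstrap_servers => "localhost:9092"   # a Kafka broker, not Zookeeper
            topics => ["kafkatest2"]
            group_id => "logstash"                  # optional; "logstash" is the default
        }
    }
    output
    {
        syslog
        {
            host => "targetserver"
            port => 514                             # assumed standard syslog port; use yours
        }
    }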

You can find the latest configuration options in the official plugin documentation: https://www.elastic.co/guide/en/logstash/current/plugins-inputs-kafka.html