I'm running some sample code from http://www.javaworld.com/article/3060078/big-data/big-data-messaging-with-kafka-part-1.html?page=2, and the KafkaConsumer is consuming from the topic as desired, but every poll prints many DEBUG logs to stdout, which I don't want.
I have tried changing every INFO and DEBUG to ERROR in /config/log4j.properties (I even grepped to make sure), in particular setting log4j.logger.kafka=ERROR, kafkaAppender, but the problem persists. I referred to How to configure logging for Kafka producers? and adopted the solution there, but perhaps the situation is different for consumers?
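For reference, the relevant lines of my /config/log4j.properties now read roughly like this (kafkaAppender is the appender already defined in the stock Kafka file):

    log4j.rootLogger=ERROR, stdout
    log4j.logger.kafka=ERROR, kafkaAppender
    log4j.logger.org.apache.kafka=ERROR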
The DEBUG messages all have a similar format:
[Thread-0] DEBUG org.apache.kafka.clients.consumer.internals.Fetcher - Sending fetch for partitions... to broker... (id: 0 rack: null)
and are appearing at a rate of about ten per second (changing the poll argument to 1000 or even 10000 doesn't help; I tried).
Would really appreciate any help from any expert. Thanks in advance!
Edit: Not sure if it matters, but I added BasicConfigurator.configure(); to my main method to resolve some other error that previously stopped the Consumer from even starting.
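A minimal sketch of what the main method looks like now (the class name is illustrative; the consumer setup itself is unchanged from the article):

    import org.apache.log4j.BasicConfigurator;

    public class ConsumerMain {
        public static void main(String[] args) {
            // Added to resolve the startup error mentioned above. Note that log4j 1.x's
            // BasicConfigurator attaches a console appender to the root logger, which
            // logs at DEBUG level by default.
            BasicConfigurator.configure();
            // ... KafkaConsumer setup and poll loop from the article ...
        }
    }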
Create a new logback config file at src/main/resources/logback.xml. The Kafka clients log through SLF4J, so if logback is the binding on your classpath, the broker's /config/log4j.properties is never consulted; the levels are controlled by this file instead:
    <configuration>
        <appender name="STDOUT" class="ch.qos.logback.core.ConsoleAppender">
            <encoder>
                <pattern>%d{HH:mm:ss.SSS} [%thread] %-5level %logger{36} - %msg%n</pattern>
            </encoder>
        </appender>

        <logger name="org.apache.kafka" level="WARN"/>
        <logger name="org.apache.kafka.common.metrics" level="WARN"/>

        <root level="warn">
            <appender-ref ref="STDOUT" />
        </root>
    </configuration>
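This assumes logback-classic (the SLF4J binding for logback) is on the application's classpath; with Maven that is a dependency along these lines (the version shown is illustrative):

    <dependency>
        <groupId>ch.qos.logback</groupId>
        <artifactId>logback-classic</artifactId>
        <version>1.2.11</version>
    </dependency>

Logback picks up src/main/resources/logback.xml automatically at startup, so no code changes are needed; the <logger name="org.apache.kafka" level="WARN"/> entry is what silences the Fetcher DEBUG lines.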