How to fix the Hadoop "WARNING: An illegal reflective access operation has occurred" error on Ubuntu

Amalendu Kar · Sep 3, 2018

I have installed Java (openjdk version "10.0.2") and Hadoop 2.9.0 successfully, and all processes are running fine:

hadoopusr@amalendu:~$ jps
19888 NameNode
20388 DataNode
20898 NodeManager
20343 SecondaryNameNode
20539 ResourceManager
21118 Jps

But whenever I try to execute any command, such as hdfs dfs -ls /, I get these warnings:

hadoopusr@amalendu:~$ hdfs dfs -ls /
WARNING: An illegal reflective access operation has occurred
WARNING: Illegal reflective access by org.apache.hadoop.security.authentication.util.KerberosUtil (file:/usr/local/hadoop/share/hadoop/common/lib/hadoop-auth-2.9.0.jar) to method sun.security.krb5.Config.getInstance()
WARNING: Please consider reporting this to the maintainers of org.apache.hadoop.security.authentication.util.KerberosUtil
WARNING: Use --illegal-access=warn to enable warnings of further illegal reflective access operations
WARNING: All illegal access operations will be denied in a future release
18/09/04 00:29:24 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable

Please help me figure out how to fix this. This is my ~/.bashrc configuration:

export JAVA_HOME=/usr/lib/jvm/java-11-openjdk-amd64
export HADOOP_HOME=/usr/local/hadoop
export PATH=$PATH:$HADOOP_HOME/bin
export PATH=$PATH:$HADOOP_HOME/sbin
export HADOOP_MAPRED_HOME=$HADOOP_HOME
export HADOOP_COMMON_HOME=$HADOOP_HOME
export HADOOP_HDFS_HOME=$HADOOP_HOME
export YARN_HOME=$HADOOP_HOME
export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_HOME/lib/native
export HADOOP_OPTS="-Djava.library.path=$HADOOP_HOME/lib"
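
(As a side note, the last warning in the output above, from util.NativeCodeLoader, is unrelated to reflective access. A commonly suggested tweak - only an assumption here, since it depends on the native libraries actually being under $HADOOP_HOME/lib/native, as the variable above implies - is to point java.library.path at that native directory instead of $HADOOP_HOME/lib:)

# Hedged tweak for the NativeCodeLoader warning only; assumes the native
# libraries really live in $HADOOP_COMMON_LIB_NATIVE_DIR as defined above.
export HADOOP_OPTS="-Djava.library.path=$HADOOP_COMMON_LIB_NATIVE_DIR"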

Answer

Eugene · Sep 3, 2018

There is nothing you can do about these warnings; they are related to the Jigsaw project and its strong(er) encapsulation.

Basically, there is a class called sun.security.krb5.Config that is part of a module called java.security.jgss. A module "defines" what it exports (what others can use out of it) and to whom. In plain English, this means the class is not meant for public use - don't touch it. Well, Hadoop did, and fixing that is part of their effort, not yours. You can report this or try to upgrade Hadoop; it may already be fixed.
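
If the warning noise itself is the concern, one possible workaround (a sketch only - it assumes the hadoop/hdfs launcher scripts pass HADOOP_OPTS through to the JVM, as they do in the 2.x line) is to explicitly open the package named in the warning to unnamed modules. That makes the reflective access legal, so the JVM stops printing the warning; it does not change Hadoop's behaviour:

# Sketch: open sun.security.krb5 (in module java.security.jgss) to code on the
# classpath, so the reflective call from hadoop-auth is no longer "illegal".
export HADOOP_OPTS="$HADOOP_OPTS --add-opens java.security.jgss/sun.security.krb5=ALL-UNNAMED"

After re-sourcing ~/.bashrc, running hdfs dfs -ls / again should show whether the reflective access warning is gone.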