This is both a general question about Java EOF exceptions and a question about Hadoop's EOF exception as it relates to jar interoperability. Comments and answers on either topic are acceptable.
Background
I've noticed several threads discussing a cryptic exception that is ultimately thrown by a readInt call. The exception seems to have some generic implications independent of Hadoop, but in this case it is caused by interoperability of Hadoop jars.
In my case, I'm getting it when I try to create a new FileSystem object in Hadoop, from Java.
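A minimal sketch of the kind of code involved (the host and port here are placeholders, not necessarily what a working setup would use):

    import java.net.URI;

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileStatus;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public class HadoopRemote {
        public static void main(String[] args) throws Exception {
            Configuration conf = new Configuration();
            // Placeholder namenode address -- substitute the cluster's real host/port.
            conf.set("fs.default.name", "hdfs://10.0.1.37:8020");

            // FileSystem.get() opens the RPC connection to the namenode;
            // this is the call the EOFException surfaces from.
            FileSystem fs = FileSystem.get(URI.create("hdfs://10.0.1.37:8020"), conf);

            for (FileStatus status : fs.listStatus(new Path("/"))) {
                System.out.println(status.getPath());
            }
            fs.close();
        }
    }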
Question
My question is: what is happening, and why does reading an integer throw an EOF exception? What "file" is this EOF exception referring to, and why would such an exception be thrown if two jars are not capable of interoperating?
Secondarily, I would also like to know how to fix this error so that I can connect to and read/write Hadoop's filesystem remotely, using the hdfs protocol with the Java API. Here is the trace:
    java.io.IOException: Call to /10.0.1.37:50070 failed on local exception: java.io.EOFException
        at org.apache.hadoop.ipc.Client.wrapException(Client.java:1139)
        at org.apache.hadoop.ipc.Client.call(Client.java:1107)
        at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:226)
        at $Proxy0.getProtocolVersion(Unknown Source)
        at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:398)
        at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:384)
        at org.apache.hadoop.hdfs.DFSClient.createRPCNamenode(DFSClient.java:111)
        at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:213)
        at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:180)
        at org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSystem.java:89)
        at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:1514)
        at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:67)
        at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:1548)
        at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:1530)
        at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:228)
        at sb.HadoopRemote.main(HadoopRemote.java:35)
    Caused by: java.io.EOFException
        at java.io.DataInputStream.readInt(DataInputStream.java:375)
        at org.apache.hadoop.ipc.Client$Connection.receiveResponse(Client.java:819)
        at org.apache.hadoop.ipc.Client$Connection.run(Client.java:720)
Regarding Hadoop: I fixed the error! You need to make sure core-site.xml serves on 0.0.0.0 instead of 127.0.0.1 (localhost), so the namenode binds on all interfaces rather than only the loopback interface.
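For reference, that setting looks roughly like this (the port 9000 is an assumption, a commonly used namenode IPC port; use whatever port your cluster actually runs on):

    <!-- core-site.xml: bind the namenode on all interfaces, not just loopback.
         The port (9000) is an assumption; substitute your cluster's IPC port. -->
    <configuration>
      <property>
        <name>fs.default.name</name>
        <value>hdfs://0.0.0.0:9000</value>
      </property>
    </configuration>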
If you get the EOF exception, it means that the port is not accessible externally on that IP, so there is no data to read on the Hadoop client/server IPC connection. The "file" is just the socket's input stream: the server closes the connection, DataInputStream.readInt hits end-of-stream before it can read 4 bytes, and it reports that as an EOFException.
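A quick way to sanity-check this from the client machine, before going through the Hadoop API at all, is to see whether the namenode port accepts a plain TCP connection (host and port here are hypothetical; substitute your own):

    import java.net.InetSocketAddress;
    import java.net.Socket;

    public class PortCheck {
        public static void main(String[] args) throws Exception {
            // Hypothetical namenode host/port -- substitute your own values.
            try (Socket s = new Socket()) {
                s.connect(new InetSocketAddress("10.0.1.37", 9000), 5000);
                System.out.println("port reachable");
            } catch (java.io.IOException e) {
                System.out.println("port NOT reachable: " + e);
            }
        }
    }

If this prints "port NOT reachable" from the remote machine but works on the Hadoop box itself, the service is bound to 127.0.0.1, which is exactly the situation the core-site.xml change above fixes.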