Hadoop : java.io.IOException: Call to localhost/127.0.0.1:54310 failed on local exception: java.io.EOFException

Harry · Aug 5, 2014 · Viewed 7.3k times

I am new to Hadoop; I only started with it today. I want to write a file to the HDFS server. I am using Hadoop 1.2.1 on the server, and when I run the jps command in the CLI I can see all the nodes running:

31895 Jps
29419 SecondaryNameNode
29745 TaskTracker
29257 DataNode

This is my sample client code to write a file to HDFS:

package com.test.hadoop.writefiles;

import java.io.BufferedInputStream;
import java.io.FileInputStream;
import java.io.InputStream;
import java.io.OutputStream;
import java.net.URI;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IOUtils;
import org.apache.hadoop.util.Progressable;

public class FileWriter {
    public static void main(String[] args) {
        try {
            // 1. Get an instance of Configuration
            Configuration configuration = new Configuration();
            configuration.addResource(new Path("/data/WorkArea/hadoop/hadoop-1.2.1/hadoop-1.2.1/conf/core-site.xml"));
            configuration.addResource(new Path("/data/WorkArea/hadoop/hadoop-1.2.1/hadoop-1.2.1/conf/hdfs-site.xml"));
            // 2. Create an InputStream to read the data from the local file
            InputStream inputStream = new BufferedInputStream(
                    new FileInputStream("/home/local/PAYODA/hariprasanth.l/Desktop/ProjectionTest"));
            // 3. Get the HDFS instance
            FileSystem hdfs = FileSystem.get(new URI("hdfs://localhost:54310"), configuration);
            // 4. Open an OutputStream to write the data; it is obtained from the FileSystem
            OutputStream outputStream = hdfs.create(
                    new Path("hdfs://localhost:54310/user/hadoop/Hadoop_File.txt"),
                    new Progressable() {
                        @Override
                        public void progress() {
                            System.out.println("....");
                        }
                    });
            try {
                IOUtils.copyBytes(inputStream, outputStream, 4096, false);
            } finally {
                IOUtils.closeStream(inputStream);
                IOUtils.closeStream(outputStream);
            }
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
}

This is the exception I get when running the code:

java.io.IOException: Call to localhost/127.0.0.1:54310 failed on local exception: java.io.EOFException
at org.apache.hadoop.ipc.Client.wrapException(Client.java:1063)
at org.apache.hadoop.ipc.Client.call(Client.java:1031)
at org.apache.hadoop.ipc.WritableRpcEngine$Invoker.invoke(WritableRpcEngine.java:198)
at com.sun.proxy.$Proxy0.getProtocolVersion(Unknown Source)
at org.apache.hadoop.ipc.WritableRpcEngine.getProxy(WritableRpcEngine.java:235)
at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:275)
at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:249)
at org.apache.hadoop.hdfs.DFSClient.createRPCNamenode(DFSClient.java:163)
at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:283)
at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:247)
at org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSystem.java:109)
at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:1792)
at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:76)
at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:1826)
at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:1808)
at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:265)
at com.test.hadoop.writefiles.FileWriter.main(FileWriter.java:27)
Caused by: java.io.EOFException
at java.io.DataInputStream.readInt(DataInputStream.java:392)
at org.apache.hadoop.ipc.Client$Connection.receiveResponse(Client.java:760)
at org.apache.hadoop.ipc.Client$Connection.run(Client.java:698)

When I debug it, the error happens on the line where I try to connect to the local HDFS server:

  FileSystem hdfs = FileSystem.get(new URI("hdfs://localhost:54310"), configuration);
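
To rule out a wrong endpoint, one can also print what the client resolves from the loaded configuration (a minimal sketch; "fs.default.name" is the Hadoop 1.x property key, and this assumes the same configuration object built above):

// Confirm the endpoint the client will actually dial; the printed URI
// should match the NameNode address configured in core-site.xml.
System.out.println("fs.default.name = " + configuration.get("fs.default.name"));
System.out.println("default URI     = " + FileSystem.getDefaultUri(configuration));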

As far as I could tell from googling, it indicates that I am mismatching versions.

The server version of Hadoop is 1.2.1. The client jars I am using are:

hadoop-common-0.22.0.jar
hadoop-hdfs-0.22.0.jar
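
To see which version the client classpath actually provides, a quick check (a minimal sketch; org.apache.hadoop.util.VersionInfo exists in both release lines) is:

import org.apache.hadoop.util.VersionInfo;

// Print the Hadoop version the client jars provide. If this prints 0.22.0
// while the cluster runs 1.2.1, the RPC formats differ, which is consistent
// with the EOFException above.
public class PrintHadoopVersion {
    public static void main(String[] args) {
        System.out.println("Hadoop version: " + VersionInfo.getVersion());
        System.out.println("Built from:     " + VersionInfo.getBuildVersion());
    }
}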

Please tell me what the problem is, ASAP.

If possible, recommend where I can find the client jars for Hadoop, and name the jars too... please...

Regards, Hari

Answer

Harry · Aug 5, 2014

It is because the same class is represented in different jars (i.e., hadoop-commons and hadoop-core contain the same class). Actually, I got confused about which of the corresponding jars to use.

Finally, I ended up using just Apache hadoop-core. It works like a charm.
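
If you want to check which jar a conflicting class is actually loaded from, a quick diagnostic (a minimal sketch using the standard ProtectionDomain API; the FileSystem class here is just an example) is:

// Print the jar a Hadoop class was loaded from, to spot the same class
// being served by both hadoop-common and hadoop-core on the classpath.
public class WhichJar {
    public static void main(String[] args) {
        Class<?> c = org.apache.hadoop.fs.FileSystem.class;
        System.out.println(c.getName() + " loaded from: "
                + c.getProtectionDomain().getCodeSource().getLocation());
    }
}

With only one matching artifact on the classpath (hadoop-core 1.2.1 for a 1.2.1 cluster), the duplicate-class ambiguity disappears.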