Apache Spark error : Could not connect to akka.tcp://sparkMaster@

Fanooos · Feb 11, 2015 · Viewed 9.6k times

These are our first steps with big data tools like Apache Spark and Hadoop.

We have installed Cloudera CDH 5.3 and, from Cloudera Manager, chose to install Spark. Spark is up and running on one of the nodes in the cluster.

From my machine I wrote a small application that connects to the cluster and reads a text file stored on HDFS.

I am trying to run the application from Eclipse, and it displays these messages:

15/02/11 14:44:01 INFO client.AppClient$ClientActor: Connecting to master spark://10.62.82.21:7077...
15/02/11 14:44:02 WARN client.AppClient$ClientActor: Could not connect to akka.tcp://sparkMaster@10.62.82.21:7077: akka.remote.InvalidAssociation: Invalid address: akka.tcp://sparkMaster@10.62.82.21:7077
15/02/11 14:44:02 WARN Remoting: Tried to associate with unreachable remote address [akka.tcp://sparkMaster@10.62.82.21:7077]. Address is now gated for 5000 ms, all messages to this address will be delivered to dead letters. Reason: Connection refused: no further information: /10.62.82.21:7077

The application has one class that creates a context using the following line:

JavaSparkContext sc = new JavaSparkContext(new SparkConf().setAppName("Spark Count").setMaster("spark://10.62.82.21:7077"));

where this IP is the IP of the machine Spark is running on.

Then I try to read a file from HDFS using the following line:

sc.textFile("hdfs://10.62.82.21/tmp/words.txt")
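For reference, the whole driver boils down to something like the minimal sketch below. The class name, the count() action and the stop() call are purely illustrative; only the master URL and HDFS path are taken from the snippets above.

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;

public class SparkCount {
    public static void main(String[] args) {
        // Same master URL as in the snippet above; this is where the
        // driver tries to connect and where the warnings are raised.
        SparkConf conf = new SparkConf()
                .setAppName("Spark Count")
                .setMaster("spark://10.62.82.21:7077");
        JavaSparkContext sc = new JavaSparkContext(conf);

        // textFile only defines the RDD; a job runs once an action
        // such as count() is called.
        JavaRDD<String> lines = sc.textFile("hdfs://10.62.82.21/tmp/words.txt");
        System.out.println("Line count: " + lines.count());

        sc.stop();
    }
}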

When I run the application, I get the messages shown above.

Answer

G Quintana · Feb 11, 2015

Check your Spark master logs; you should see something like:

15/02/11 13:37:14 INFO Remoting: Remoting started; listening on addresses :[akka.tcp://sparkMaster@mymaster:7077]
15/02/11 13:37:14 INFO Remoting: Remoting now listens on addresses: [akka.tcp://sparkMaster@mymaster:7077]
15/02/11 13:37:14 INFO Master: Starting Spark master at spark://mymaster:7077

Then, when connecting to the master, be sure to use exactly the same hostname as found in the logs above (do not use the IP address):

.setMaster("spark://mymaster:7077"));

Spark standalone is a bit picky about hostnames versus IP addresses.
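Put together, and assuming the master log really reports spark://mymaster:7077, the driver configuration would look something like the sketch below. Note that mymaster then has to resolve from the client machine, for example via DNS or an /etc/hosts entry pointing at 10.62.82.21 (that resolution step is an assumption about this particular setup).

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;

// "mymaster" is the placeholder hostname from the log excerpt above;
// use whatever hostname your own master log prints, exactly as logged.
SparkConf conf = new SparkConf()
        .setAppName("Spark Count")
        .setMaster("spark://mymaster:7077");   // hostname, not the IP
JavaSparkContext sc = new JavaSparkContext(conf);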