Top "Apache-spark-standalone" questions

Use for questions related to Apache Spark standalone deploy mode (not local mode).

What is the relationship between workers, worker instances, and executors?

In Spark Standalone mode, there are master and worker nodes. Here are a few questions: Do two worker instances mean one worker …

apache-spark apache-spark-standalone
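
A minimal sketch of how the three relate, assuming a standalone cluster (the master URL and sizing values below are placeholders): each worker is a JVM that offers its machine's cores to the master, SPARK_WORKER_INSTANCES in spark-env.sh controls how many such worker JVMs run per machine, and the master carves per-application executors out of each worker's cores.

    import org.apache.spark.{SparkConf, SparkContext}

    // With spark.executor.cores = 2, an 8-core worker can host up to
    // 8 / 2 = 4 executors of this application at the same time.
    val conf = new SparkConf()
      .setAppName("worker-vs-executor-demo")
      .setMaster("spark://master-host:7077") // placeholder standalone master URL
      .set("spark.executor.cores", "2")      // cores granted to each executor
      .set("spark.executor.memory", "2g")    // memory granted to each executor
    val sc = new SparkContext(conf)
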
Apache Spark: Differences between client and cluster deploy modes

TL;DR: In a Spark Standalone cluster, what are the differences between client and cluster deploy modes? How do I …

apache-spark apache-spark-standalone
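
For illustration, the property behind the --deploy-mode flag can be sketched in Scala (the master URL is a placeholder); note that it is read at submission time, from spark-defaults.conf or a --conf argument, not from inside an already-running driver:

    import org.apache.spark.SparkConf

    // "client": the driver stays in the JVM that ran spark-submit.
    // "cluster": the standalone master launches the driver on a worker.
    val submitConf = new SparkConf()
      .setMaster("spark://master-host:7077")     // placeholder master URL
      .set("spark.submit.deployMode", "cluster") // or "client"
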
Which cluster type should I choose for Spark?

I am new to Apache Spark, and I just learned that Spark supports three types of clusters: Standalone - meaning …

apache-spark yarn mesos apache-spark-standalone
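
The choice mostly surfaces as the master URL handed to the application; a sketch of the forms (hosts and ports are placeholders), with the same application code running unchanged under each cluster manager:

    import org.apache.spark.SparkConf

    val standalone = new SparkConf().setMaster("spark://master-host:7077")
    val yarn       = new SparkConf().setMaster("yarn")
    val mesos      = new SparkConf().setMaster("mesos://mesos-master:5050")
    val local      = new SparkConf().setMaster("local[*]") // single JVM, for testing only
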
winutils Spark Windows installation env_variable

I am trying to install Spark 1.6.1 on Windows 10, and so far I have done the following... Downloaded Spark 1.6.1, unpacked it to …

windows git scala apache-spark apache-spark-standalone
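
A commonly cited workaround sketch for this class of Windows setup failures, assuming winutils.exe has been placed under a bin\ folder (the path is a placeholder; HADOOP_HOME can be set as an environment variable instead):

    import org.apache.spark.{SparkConf, SparkContext}

    // Hadoop's Windows shims look for %HADOOP_HOME%\bin\winutils.exe, so point
    // hadoop.home.dir at the parent folder before creating the context.
    System.setProperty("hadoop.home.dir", "C:\\hadoop") // expects C:\hadoop\bin\winutils.exe
    val sc = new SparkContext(
      new SparkConf().setAppName("windows-demo").setMaster("local[*]"))
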
How do I run multiple Spark applications in parallel on a standalone master

Using the Spark (1.6.1) standalone master, I need to run multiple applications on the same Spark master. All applications submitted after the first one, …

apache-spark config high-availability apache-spark-standalone
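
The standalone master schedules applications FIFO, and by default the first application claims every core, queueing the rest. A sketch of the usual fix, capping each application with spark.cores.max (master URL and values are placeholders):

    import org.apache.spark.{SparkConf, SparkContext}

    // Capping this app at 4 cores leaves the remaining cores free, so a second
    // application submitted to the same master can run concurrently.
    val conf = new SparkConf()
      .setAppName("bounded-app")
      .setMaster("spark://master-host:7077") // placeholder
      .set("spark.cores.max", "4")           // total cores this app may claim
      .set("spark.executor.memory", "2g")
    val sc = new SparkContext(conf)
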
Running Spark driver program in Docker container - no connection back from executor to the driver?

UPDATE: The problem is resolved. The Docker image is here: docker-spark-submit. I run spark-submit with a fat jar inside a …

docker apache-spark mesos apache-spark-standalone
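
The usual shape of the fix, sketched with placeholder addresses and ports: advertise an address the executors can reach while binding inside the container, and pin the driver's ports so Docker can publish them.

    import org.apache.spark.{SparkConf, SparkContext}

    val conf = new SparkConf()
      .setAppName("dockerized-driver")
      .setMaster("spark://master-host:7077")      // placeholder
      .set("spark.driver.host", "192.168.1.10")   // host address reachable from workers
      .set("spark.driver.bindAddress", "0.0.0.0") // bind inside the container (Spark 2.1+)
      .set("spark.driver.port", "5001")           // fixed so it can be mapped by Docker
      .set("spark.blockManager.port", "5002")     // likewise for block transfers
    val sc = new SparkContext(conf)
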
Controlling the number of executors/cores in Spark Standalone

So I have a Spark standalone server with 16 cores and 64 GB of RAM. I have both the master and worker …

apache-spark apache-spark-standalone
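
Illustrative sizing for that one 16-core / 64 GB machine (the master URL is a placeholder): with 4 cores and 14 GB per executor, the master can start 16 / 4 = 4 executors for the application, while leaving memory headroom for the OS and the Spark daemons.

    import org.apache.spark.{SparkConf, SparkContext}

    val conf = new SparkConf()
      .setAppName("sizing-demo")
      .setMaster("spark://master-host:7077") // placeholder
      .set("spark.executor.cores", "4")      // cores per executor
      .set("spark.executor.memory", "14g")   // memory per executor
      .set("spark.cores.max", "16")          // total cores for this application
    val sc = new SparkContext(conf)
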
Connecting to remote Spark Cluster

I'm trying to host a Spark standalone cluster locally. I have two heterogeneous machines connected on a LAN. Each piece …

apache-spark pyspark cluster-computing apache-spark-standalone
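
A sketch of a driver on one LAN machine attaching to a master on the other (hostnames are placeholders). Connectivity must work in both directions: the driver reaches master-host:7077, and every worker must be able to reach the driver's advertised host back.

    import org.apache.spark.{SparkConf, SparkContext}

    val conf = new SparkConf()
      .setAppName("remote-cluster-demo")
      .setMaster("spark://master-host:7077")
      .set("spark.driver.host", "driver-host") // a name/IP the workers can resolve
    val sc = new SparkContext(conf)
    println(sc.parallelize(1 to 100).sum())    // quick end-to-end sanity check
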
How to add the "--deploy-mode cluster" option to my Scala code

Hello, I want to add the "--deploy-mode cluster" option to my Scala code: val sparkConf = new SparkConf().setMaster("spark://192.168.60.80:7077") Without …

scala apache-spark spark-streaming apache-spark-standalone
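
Since --deploy-mode is consumed at submission time, it cannot be switched on from inside an already-running driver; a programmatic alternative is the SparkLauncher API, sketched here with a placeholder jar path and main class:

    import org.apache.spark.launcher.SparkLauncher

    // Launches the application against the standalone master in cluster mode,
    // equivalent to passing --deploy-mode cluster to spark-submit.
    val app = new SparkLauncher()
      .setAppResource("/path/to/streaming-app.jar") // placeholder fat jar
      .setMainClass("com.example.StreamingMain")    // placeholder main class
      .setMaster("spark://192.168.60.80:7077")
      .setDeployMode("cluster")
      .launch()
    app.waitFor()
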
Spark workers stopped after driver commanded a shutdown

Basically, the master node also performs as one of the slaves. Once the slave on the master completed, it called the SparkContext …

apache-spark apache-spark-standalone
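
For context, "driver commanded a shutdown" is the message executors log when the driver stops its SparkContext, so it marks a deliberate teardown rather than a crash. A minimal sketch (the master URL is a placeholder) of making that shutdown explicit:

    import org.apache.spark.{SparkConf, SparkContext}

    val sc = new SparkContext(
      new SparkConf().setAppName("clean-shutdown").setMaster("spark://master-host:7077"))
    try {
      println(sc.parallelize(1 to 1000).count())
    } finally {
      sc.stop() // executors log that the driver commanded a shutdown; this is expected
    }
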