How to use both Scala and Python in the same Spark project?

Wilson Liao · Oct 6, 2015

Is it possible to pipe a Spark RDD to Python?

I need a Python library to do some calculations on my data, but my main Spark project is written in Scala. Is there a way to mix the two, or to let Python access the same Spark context?

Answer

Stephen De Gennaro · Oct 6, 2015

You can indeed pipe out to Python from Scala and Spark: call pipe on the RDD and point it at a regular Python script.

test.py

#!/usr/bin/python
import sys

# Each RDD element arrives on stdin followed by a newline;
# strip it, then echo the element back with a prefix.
for line in sys.stdin:
    print("hello " + line.rstrip("\n"))

spark-shell (Scala)

// Build an RDD from a local collection
val data = List("john", "paul", "george", "ringo")
val dataRDD = sc.makeRDD(data)

// Path to the Python script; it must be executable (chmod +x test.py)
val scriptPath = "./test.py"

// Pipe every element of the RDD through the script's stdin/stdout
val pipeRDD = dataRDD.pipe(scriptPath)

pipeRDD.foreach(println)
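In the local spark-shell this prints straight to your console; on a cluster, foreach(println) runs on the executors, so you would normally bring the results back to the driver first, for example:

pipeRDD.collect().foreach(println)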

Output

hello john
hello ringo
hello george
hello paul
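
If you would rather run this from a compiled Scala project than from the spark-shell, a minimal sketch might look like the following. The object name PipeToPython, the local[*] master, and the deployment comments are illustrative choices, not part of the original answer.

PipeToPython.scala

import org.apache.spark.{SparkConf, SparkContext}

object PipeToPython {
  def main(args: Array[String]): Unit = {
    // App name and master are illustrative; on a real cluster the master
    // normally comes from spark-submit rather than being hard-coded.
    val conf = new SparkConf().setAppName("PipeToPython").setMaster("local[*]")
    val sc = new SparkContext(conf)

    val data = List("john", "paul", "george", "ringo")
    val dataRDD = sc.makeRDD(data)

    // In local mode "./test.py" resolves against the driver's working
    // directory; on a cluster the script has to be shipped to the workers
    // first (for example with spark-submit --files test.py).
    val pipeRDD = dataRDD.pipe("./test.py")

    // Bring the piped lines back to the driver before printing
    pipeRDD.collect().foreach(println)

    sc.stop()
  }
}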