Is it possible to pipe a Spark RDD to Python?
I need a Python library to do some calculations on my data, but my main Spark project is based on Scala. Is there a way to mix the two, or to let Python access the same Spark context?
You can indeed pipe a Scala Spark RDD out to a regular Python script using RDD.pipe.
test.py
#!/usr/bin/python
import sys

# Echo each element back with a greeting. rstrip drops the trailing
# newline that arrives with every line, since print adds its own.
for line in sys.stdin:
    print("hello " + line.rstrip("\n"))
spark-shell (Scala)
val data = List("john", "paul", "george", "ringo")
val dataRDD = sc.makeRDD(data)

// The script must be executable and reachable from every worker node.
val scriptPath = "./test.py"
val pipeRDD = dataRDD.pipe(scriptPath)

pipeRDD.foreach(println)
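A relative path like ./test.py only works when every worker can see the file, which is true in local mode but not on a real cluster. A minimal sketch of shipping the script with sc.addFile, assuming test.py sits in the driver's working directory:

import org.apache.spark.SparkFiles

// Distribute the script to every node, then pipe through the shipped copy.
sc.addFile("./test.py")
val clusterPipeRDD = dataRDD.pipe(Seq(SparkFiles.get("test.py")))

// collect() brings the results back so println runs on the driver.
clusterPipeRDD.collect().foreach(println)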
Output
hello john
hello ringo
hello george
hello paul
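Note that the output order is not deterministic: foreach(println) runs on each partition in parallel, so the greetings can come back in any order. The result is an ordinary RDD[String], so it composes with further Scala transformations, for example:

// Transform the piped strings further, then collect them to the driver.
pipeRDD.map(_.toUpperCase).collect().foreach(println)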