I'm building an Apache Spark Streaming application and cannot make it log to a file on the local filesystem when running it on YARN. How can I achieve this?
I've set up my log4j.properties
file so that it can successfully write to a log file in the /tmp
directory on the local filesystem (shown below in part):
log4j.appender.file=org.apache.log4j.FileAppender
log4j.appender.file.File=/tmp/application.log
log4j.appender.file.append=false
log4j.appender.file.layout=org.apache.log4j.PatternLayout
log4j.appender.file.layout.ConversionPattern=%d{yyyy-MM-dd HH:mm:ss} %-5p %c{1}:%L - %m%n
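For completeness, an appender only takes effect once a logger references it by name; in a setup like the one above, the part not shown would typically attach the `file` appender to the root logger, along the lines of (the INFO level here is an assumption):

```properties
# Hypothetical completion of the snippet above: attach the "file"
# appender to the root logger so its output actually goes anywhere.
log4j.rootLogger=INFO, file
```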
When I run my Spark application locally by using the following command:
spark-submit --class myModule.myClass --master local[2] --deploy-mode client myApp.jar
It runs fine and I can see that log messages are written to /tmp/application.log
on my local file system.
But when I run the same application via YARN, e.g.
spark-submit --class myModule.myClass --master yarn-client --name "myModule" --total-executor-cores 1 --executor-memory 1g myApp.jar
or
spark-submit --class myModule.myClass --master yarn-cluster --name "myModule" --total-executor-cores 1 --executor-memory 1g myApp.jar
I cannot see any /tmp/application.log
on the local file system of the machine that runs YARN.
What am I missing?
It looks like you'll need to append to the JVM arguments used when launching your tasks/jobs.
Try editing conf/spark-defaults.conf
as described here:
spark.executor.extraJavaOptions=-Dlog4j.configuration=file:/apps/spark-1.2.0/conf/log4j.properties
spark.driver.extraJavaOptions=-Dlog4j.configuration=file:/apps/spark-1.2.0/conf/log4j.properties
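Note that with a file: URI like the one above, the executor JVMs run on the remote NodeManager hosts, so that exact path has to exist on every worker node. A common alternative is to let Spark ship the properties file into each YARN container and then reference it by its bare name, which resolves in the container's working directory. A sketch in spark-defaults.conf, assuming your log4j.properties lives at the path shown on the submitting machine:

```properties
# Sketch: spark.yarn.dist.files copies the file into every YARN
# container, so the bare file name resolves locally in each one.
spark.yarn.dist.files=file:/apps/spark-1.2.0/conf/log4j.properties
spark.executor.extraJavaOptions=-Dlog4j.configuration=log4j.properties
spark.driver.extraJavaOptions=-Dlog4j.configuration=log4j.properties
```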
Alternatively, try editing conf/spark-env.sh
as described here to add the same JVM arguments, although the entries in conf/spark-defaults.conf should work.
If you are still not getting any joy, you can explicitly pass the location of your log4j.properties file to spark-submit
on the command line, like this, if the file is contained within your JAR and sits at the root of your classpath:
spark-submit --class sparky.MyApp --master spark://my.host.com:7077 --conf "spark.executor.extraJavaOptions=-Dlog4j.configuration=log4j-executor.properties" myapp.jar
If the file is not on your classpath, use the file:
prefix and the full path, like this:
spark-submit ... --conf "spark.executor.extraJavaOptions=-Dlog4j.configuration=file:/apps/spark-1.2.0/conf/log4j-executor.properties" ...
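Finally, keep in mind that a FileAppender writes on whichever host the JVM is running, so even once this is configured you should look for /tmp/application.log on the NodeManager machines that ran your executors, not on the machine you submitted from. With YARN log aggregation enabled, you can also pull everything back to the submitting machine with the yarn CLI (the application id placeholder below is yours to fill in):

```shell
# Fetch the aggregated container logs for a finished application;
# <application_id> is the id YARN assigned to your Spark job.
yarn logs -applicationId <application_id>
```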