Should Spark on YARN example include --addJars?


Sandy Ryza
Hey All,

I ran into an issue when trying to run SparkPi as described in the Spark on
YARN doc.

14/01/18 10:52:09 ERROR spark.SparkContext: Error adding jar
(java.io.FileNotFoundException:
spark-examples-assembly-0.9.0-incubating-SNAPSHOT.jar (No such file or
directory)), was the --addJars option used?

Shouldn't --addJars be used here, then?

Here's the doc:

SPARK_JAR=./assembly/target/scala-2.9.3/spark-assembly-0.8.1-incubating-hadoop2.0.5-alpha.jar \
    ./spark-class org.apache.spark.deploy.yarn.Client \
      --jar examples/target/scala-2.9.3/spark-examples-assembly-0.8.1-incubating.jar \
      --class org.apache.spark.examples.SparkPi \
      --args yarn-standalone \
      --num-workers 3 \
      --master-memory 4g \
      --worker-memory 2g \
      --worker-cores 1
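For comparison, here is a sketch of the same invocation with the examples jar also passed via --addJars, so the driver can distribute it through SparkContext.addJar. This assumes the Client's --addJars option takes a comma-separated list of local jar paths (as the error message hints); the paths shown are just the ones from the doc, not verified against a real cluster:

SPARK_JAR=./assembly/target/scala-2.9.3/spark-assembly-0.8.1-incubating-hadoop2.0.5-alpha.jar \
    ./spark-class org.apache.spark.deploy.yarn.Client \
      --jar examples/target/scala-2.9.3/spark-examples-assembly-0.8.1-incubating.jar \
      --addJars examples/target/scala-2.9.3/spark-examples-assembly-0.8.1-incubating.jar \
      --class org.apache.spark.examples.SparkPi \
      --args yarn-standalone \
      --num-workers 3 \
      --master-memory 4g \
      --worker-memory 2g \
      --worker-cores 1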


thanks,
Sandy