
I'm trying to run a Spark application using bin/spark-submit. When I reference my application jar on my local filesystem, it works. However, when I copy my application jar to a directory in HDFS, I get the following exception:

Warning: Skip remote jar hdfs://localhost:9000/user/hdfs/jars/simple-project-1.0-SNAPSHOT.jar. java.lang.ClassNotFoundException: com.example.SimpleApp

Here's the command:

$ ./bin/spark-submit --class com.example.SimpleApp --master local hdfs://localhost:9000/user/hdfs/jars/simple-project-1.0-SNAPSHOT.jar

I'm using Hadoop version 2.6.0 and Spark version 1.2.1.

1 Answer


Client mode cannot load the application jar from HDFS, which is why spark-submit prints the "Skip remote jar" warning and then fails with ClassNotFoundException.

To make a jar stored in HDFS usable by spark-submit, run the job in cluster mode:

$SPARK_HOME/bin/spark-submit \
  --deploy-mode cluster \
  --class <main_class> \
  --master yarn-cluster \
  hdfs://myhost:8020/user/root/myjar.jar
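
If you need to stay in client mode (as implied by --master local in the question), a common workaround is to pull the jar down from HDFS to the local filesystem first and point spark-submit at the local copy. A minimal sketch, assuming the HDFS path from the question and using /tmp only as an example destination:

$ hdfs dfs -get hdfs://localhost:9000/user/hdfs/jars/simple-project-1.0-SNAPSHOT.jar /tmp/simple-project-1.0-SNAPSHOT.jar
$ ./bin/spark-submit --class com.example.SimpleApp --master local /tmp/simple-project-1.0-SNAPSHOT.jar

Note that the cluster-mode command above uses the older --master yarn-cluster form because the question is on Spark 1.2.1; on newer Spark releases the equivalent is --master yarn --deploy-mode cluster.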

