in Big Data Hadoop & Spark by (11.5k points)

I am trying to run a Spark program that needs multiple jar files; I can only get it to run with a single jar. I want to add both jar files, which are in the same location. I tried the command below, but it shows a dependency error:

spark-submit \
  --class "max" maxjar.jar Book1.csv test \
  --driver-class-path /usr/lib/spark/assembly/lib/hive-common-0.13.1-cdh5.3.0.jar


How can I add another jar file that is in the same directory?

1 Answer

by (32.5k points)

Specifying the full path for all additional jars works, as shown below:

./bin/spark-submit --class "SparkTest" --master local[*] --jars /fullpath/first.jar,/fullpath/second.jar /fullpath/your-program.jar
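Applied to the command from the question, this is a sketch of what the invocation could look like. The path to your second jar is an assumption here; note that `--jars` takes a comma-separated list and must come before the application jar, while application arguments (`Book1.csv test`) come after it:

```shell
# Hypothetical second-jar path; adjust to your actual directory layout.
spark-submit \
  --class "max" \
  --jars /usr/lib/spark/assembly/lib/hive-common-0.13.1-cdh5.3.0.jar,/usr/lib/spark/assembly/lib/second.jar \
  maxjar.jar Book1.csv test
```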

Or list the jars in conf/spark-defaults.conf by adding lines like:

spark.driver.extraClassPath /fullpath/first.jar:/fullpath/second.jar

spark.executor.extraClassPath /fullpath/first.jar:/fullpath/second.jar

Note that the classpath entries here are separated by colons (the Unix classpath separator), unlike the comma-separated `--jars` list, and that settings in spark-defaults.conf apply to every application submitted from that installation.
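If you only want these classpath settings for a single run rather than globally, the same properties can be passed per-submission with `--conf`. A minimal sketch, assuming the same placeholder paths as above:

```shell
# One-off equivalent of the spark-defaults.conf entries above.
spark-submit \
  --conf spark.driver.extraClassPath=/fullpath/first.jar:/fullpath/second.jar \
  --conf spark.executor.extraClassPath=/fullpath/first.jar:/fullpath/second.jar \
  --class "SparkTest" --master local[*] \
  /fullpath/your-program.jar
```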


