Asked in Big Data Hadoop & Spark by (11.4k points):

I was trying to run spark-submit and got "Failed to find Spark assembly JAR. You need to build Spark before running this program." When I try to run spark-shell, I get the same error. What do I have to do in this situation?

1 Answer

Answered by (32.3k points):

Your Spark package doesn't include the compiled Spark code. That's why you get this error message when you run the spark-submit and spark-shell scripts.

You have to download one of the pre-built versions listed under "Choose a package type" on the Spark download page.
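As an illustration, on Linux or macOS the steps could look like the sketch below. The version number and Hadoop build in the URL are only examples; substitute whichever release you pick on the download page:

```
# Download a pre-built Spark package (version and Hadoop build are examples;
# use the release you chose on the download page).
wget https://archive.apache.org/dist/spark/spark-2.4.8/spark-2.4.8-bin-hadoop2.7.tgz

# Unpack the archive and step into the directory.
tar -xzf spark-2.4.8-bin-hadoop2.7.tgz
cd spark-2.4.8-bin-hadoop2.7

# The pre-built package ships the compiled jars, so this should now start
# without the "Failed to find Spark assembly JAR" error.
./bin/spark-shell
```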

For Windows:

If you installed Spark in a directory whose path contains a space, e.g. D:\ABC Folder\Spark, you will encounter errors. In that case, I suggest moving it to the root or to another directory with no spaces in the path, as sketched below.
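A minimal sketch of that fix from a Windows command prompt; the paths here are just illustrations, so use your actual install location:

```
:: Move the installation out of the path that contains a space.
move "D:\ABC Folder\Spark" D:\Spark

:: Point SPARK_HOME at the new location for this session and run the shell.
set SPARK_HOME=D:\Spark
D:\Spark\bin\spark-shell
```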
