
I'm trying to build a Scala jar file to run in Spark.
I'm following this tutorial.
When I try to build the jar file with sbt as described there, I get the following error:

[info] Resolving org.apache.spark#spark-core_2.10.4;1.0.2 ...
[warn]  module not found: org.apache.spark#spark-core_2.10.4;1.0.2
[warn] ==== local: tried
[warn]   /home/hduser/.ivy2/local/org.apache.spark/spark-core_2.10.4/1.0.2/ivys/ivy.xml
[warn] ==== Akka Repository: tried
[warn]   http://repo.akka.io/releases/org/apache/spark/spark-core_2.10.4/1.0.2/spark-core_2.10.4-1.0.2.pom
[warn] ==== public: tried
[warn]   http://repo1.maven.org/maven2/org/apache/spark/spark-core_2.10.4/1.0.2/spark-core_2.10.4-1.0.2.pom
[warn]  ::::::::::::::::::::::::::::::::::::::::::::::
[warn]  ::          UNRESOLVED DEPENDENCIES         ::
[warn]  ::::::::::::::::::::::::::::::::::::::::::::::
[warn]  :: org.apache.spark#spark-core_2.10.4;1.0.2: not found
[warn]  ::::::::::::::::::::::::::::::::::::::::::::::
[error] {file:/home/prithvi/scala/asd/}default-d57abf/*:update: sbt.ResolveException: unresolved dependency: org.apache.spark#spark-core_2.10.4;1.0.2: not found
[error] Total time: 2 s, completed 13 Aug, 2014 5:24:24 PM


What's the issue and how do I solve it?

1 Answer


Looking at your build, you have defined the dependency as:

"org.apache.spark" %% "spark-core" % "1.0.2"

The %% operator tells sbt to append your current Scala version to the artifact name, which is why the log above shows it searching for spark-core_2.10.4. However, Spark is published for the whole Scala 2.10 family as spark-core_2.10; there are no separate jars for 2.10.1, 2.10.2, and so on, so the artifact sbt is asking for does not exist.

So to resolve the problem, redefine it with the artifact name spelled out explicitly:

"org.apache.spark" % "spark-core_2.10" % "1.0.2"
