in Big Data Hadoop & Spark by (11.4k points)

I'm trying to build a Scala/Spark project in IntelliJ IDEA with the following build.sbt:

name := "try"

version := "1.0"

scalaVersion := "2.11.8"

val sparkVersion = "2.2.0"

resolvers ++= Seq(
  "apache-snapshots" at "http://repository.apache.org/snapshots/"
)

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % sparkVersion,
  "org.apache.spark" %% "spark-sql" % sparkVersion,
  "org.apache.spark" %% "spark-mllib" % sparkVersion,
  "org.apache.spark" %% "spark-streaming" % sparkVersion,
  "org.apache.spark" %% "spark-hive" % sparkVersion
)


and I'm getting a bunch of warnings:

8/6/17 1:29 PM SBT project import
[warn] Found version conflict(s) in library dependencies; some are suspected to be binary incompatible:
[warn]  * io.netty:netty:3.9.9.Final is selected over {3.6.2.Final, 3.7.0.Final}
[warn]      +- org.apache.spark:spark-core_2.11:2.2.0             (depends on 3.9.9.Final)
[warn]      +- org.apache.zookeeper:zookeeper:3.4.6               (depends on 3.6.2.Final)
[warn]      +- org.apache.hadoop:hadoop-hdfs:2.6.5                (depends on 3.6.2.Final)
[warn]  * commons-net:commons-net:2.2 is selected over 3.1
[warn]      +- org.apache.spark:spark-core_2.11:2.2.0             (depends on 2.2)
[warn]      +- org.apache.hadoop:hadoop-common:2.6.5              (depends on 3.1)
[warn]  * com.google.guava:guava:11.0.2 is selected over {12.0.1, 16.0.1}
[warn]      +- org.apache.hadoop:hadoop-yarn-client:2.6.5         (depends on 11.0.2)
[warn]      +- org.apache.hadoop:hadoop-yarn-api:2.6.5            (depends on 11.0.2)
[warn]      +- org.apache.hadoop:hadoop-yarn-common:2.6.5 


I have several, perhaps dumb, questions:

Is there a better way to structure build.sbt (e.g. by adding other resolvers), so that I can get rid of the warnings?
Should I care about these warnings at all?

1 Answer

by (32.3k points)

One way to get rid of the warnings is to manually tell sbt which dependency versions you prefer; in your case:

dependencyOverrides ++= Set(
  "io.netty" % "netty" % "3.9.9.Final",
  "commons-net" % "commons-net" % "2.2",
  "com.google.guava" % "guava" % "11.0.2"
)
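
Note that the snippet above assumes sbt 0.13, where dependencyOverrides is a Set. If you are on sbt 1.x the key is a Seq instead, so a minimal sketch with the same versions would be:

// sbt 1.x: dependencyOverrides is a Seq[ModuleID] rather than a Set
dependencyOverrides ++= Seq(
  "io.netty" % "netty" % "3.9.9.Final",
  "commons-net" % "commons-net" % "2.2",
  "com.google.guava" % "guava" % "11.0.2"
)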

You can also read about conflict management in sbt.
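
For example, with Ivy-based resolution you can tell sbt to fail on any version conflict instead of silently picking one side. This is just a sketch: with strict conflict management you would then have to settle every conflict yourself via dependencyOverrides.

// Fail resolution on any version conflict instead of selecting the latest revision (Ivy resolution)
conflictManager := ConflictManager.strict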

Should I care about warnings at all?

In your case I would say no, since your conflicts stem only from Spark-related artifacts released under the same version. Spark is a project with a large user base, and the chance of jar hell introduced by transitive dependencies is rather low (although not technically guaranteed).

