Looking at your problem, I suggest you put the libraries needed to compile your Scala code on the compile classpath. This is usually not done by hand; a build tool such as Maven or sbt takes care of it. You can find a minimal sbt setup at http://spark.apache.org/docs/1.2.0/quick-start.html#self-contained-applications
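If you go the sbt route, a minimal build.sbt along the lines of that quick-start could look something like the sketch below. The project name and the version numbers are placeholders (not taken from your question), so adjust them to match the Spark and Scala versions on your cluster:

// Minimal build.sbt sketch; name and versions are placeholders, adjust to your setup.
name := "simple-spark-app"

version := "0.1.0"

// The Scala version must match the suffix of the Spark artifact (here: _2.11).
scalaVersion := "2.11.8"

// "provided" keeps Spark out of the packaged jar, mirroring the Maven scope discussed
// below; the Spark installation on the cluster supplies these classes at runtime.
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.6.3" % "provided"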
You can also get this error message if you have the wrong scope for the Spark dependency. Something like:
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.11</artifactId>
    <version>${spark.version}</version>
    <scope>test</scope> <!-- will not be available during the compile phase -->
</dependency>
This dependency won't work in your case, so I would suggest changing it to the following:
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.11</artifactId>
    <version>${spark.version}</version>
    <scope>provided</scope>
</dependency>
This will work, and it will not include Spark in your "uberjar", which is what you will almost certainly want, since the cluster already provides the Spark classes at runtime when you submit the job with spark-submit.