Intellipaat

0 votes
in Big Data Hadoop & Spark by (11.4k points)

I'm attempting to run this from the Spark source.

This line:

val wordCounts = textFile.flatMap(line => line.split(" ")).map(word => (word, 1)).reduceByKey((a, b) => a + b)

is throwing the error:

value reduceByKey is not a member of org.apache.spark.rdd.RDD[(String, Int)]

Also for:

val wordCounts = logData.flatMap(line => line.split(" ")).map(word => (word, 1)).reduceByKey((a, b) => a + b)

logData.flatMap(line => line.split(" ")).map(word => (word, 1)) returns a MappedRDD, but I cannot find this type in the Spark API documentation.

I'm running this code from the Spark source, so could it be a classpath problem? The required dependencies are on my classpath, though.

1 Answer

0 votes
by (32.3k points)

Try to import the implicit conversions from SparkContext:

import org.apache.spark.SparkContext._

Spark uses the "pimp my library" pattern to add methods to RDDs of specific element types: the import brings implicit conversions into scope that wrap an RDD of pairs in a class providing the pair-specific operations, such as reduceByKey. For more details, see SparkContext:1296
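To see why the import matters, here is a minimal, self-contained sketch of that pattern using plain Scala collections in place of RDDs. All names here (PimpDemo, PairOps, wordCounts) are illustrative, not Spark's; the point is that reduceByKey only exists once the implicit wrapper is in scope, which mirrors how Spark's import makes reduceByKey available on RDD[(String, Int)]. (In Spark 1.3 and later the RDD implicits are pulled in automatically, so this import is only needed on older versions.)

```scala
object PimpDemo {
  // Analogue of Spark's pair-RDD wrapper: an implicit class that adds
  // reduceByKey to any Seq of key/value pairs. Without this implicit in
  // scope, calling reduceByKey on a plain Seq[(K, V)] fails to compile
  // with "value reduceByKey is not a member of ..." - the same shape of
  // error as in the question.
  implicit class PairOps[K, V](val self: Seq[(K, V)]) {
    // Merge all values that share a key using the supplied function.
    def reduceByKey(f: (V, V) => V): Map[K, V] =
      self.groupBy(_._1).map { case (k, vs) => k -> vs.map(_._2).reduce(f) }
  }

  // Word count written exactly like the Spark snippet in the question,
  // but over local collections.
  def wordCounts(text: String): Map[String, Int] =
    text.split(" ").toSeq.map(word => (word, 1)).reduceByKey(_ + _)
}
```

In Spark 1.x the role of PairOps is played by PairRDDFunctions, and `import org.apache.spark.SparkContext._` is what puts the RDD-to-PairRDDFunctions conversion in scope.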

Also, if you are using Maven in Scala IDE, you can solve this problem by updating the dependency from spark-streaming version 1.2 to version 1.3.
