
0 votes
2 views
in Big Data Hadoop & Spark by (11.4k points)

I'm new to the Scala/Spark stack and I'm trying to figure out how to test my basic skills by using Spark SQL to "map" RDDs to TempTables and vice versa.

I have two distinct .scala files with the same code: a simple object (with def main...) and an object extending App.

In the simple object I get a "No TypeTag available" error related to my case class Log:

object counter {
  def main(args: Array[String]) {
    // ...
    val sqlContext = new org.apache.spark.sql.SQLContext(sc)
    import sqlContext.createSchemaRDD
    case class Log(visitatore: String, data: java.util.Date, pagina: String, count: Int)
    val log = triple.map(p => Log(p._1, p._2, p._3, p._4))
    log.registerTempTable("logs")
    val logSessioni = sqlContext.sql("SELECT visitatore, data, pagina, count FROM logs")
    logSessioni.foreach(println)
  }
}


The error at the line log.registerTempTable("logs") says "No TypeTag available for Log".

In the other file (the object extending App) everything works fine:

object counterApp extends App {
  // ...
  val sqlContext = new org.apache.spark.sql.SQLContext(sc)
  import sqlContext.createSchemaRDD
  case class Log(visitatore: String, data: java.util.Date, pagina: String, count: Int)
  val log = triple.map(p => Log(p._1, p._2, p._3, p._4))
  log.registerTempTable("logs")
  val logSessioni = sqlContext.sql("SELECT visitatore, data, pagina, count FROM logs")
  logSessioni.foreach(println)
}

1 Answer

0 votes
by (32.3k points)

Just move your case class out of the method definition.

The problem is that your case class Log is defined inside the method where it is used. Simply move the case class definition outside of the method and it will work. I would have to take a look at how this compiles down, but my guess is that it is a chicken-and-egg problem: the TypeTag (used for reflection) cannot be resolved implicitly because the class has not been fully defined at that point. There is a Spark JIRA issue that explains this more officially and notes that Spark would need to use a WeakTypeTag to support this case.
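
For example, here is a minimal sketch of the fixed layout. The SparkContext setup and the sample data standing in for the original triple RDD are assumptions added only to make the snippet self-contained; the rest mirrors the code from the question.

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SQLContext

// Case class defined at the top level, outside main, so the compiler
// can provide a TypeTag and Spark SQL can infer its schema.
case class Log(visitatore: String, data: java.util.Date, pagina: String, count: Int)

object counter {
  def main(args: Array[String]) {
    val sc = new SparkContext(new SparkConf().setAppName("counter").setMaster("local[*]"))
    val sqlContext = new SQLContext(sc)
    import sqlContext.createSchemaRDD

    // Hypothetical sample data in place of the original `triple` RDD
    val triple = sc.parallelize(Seq(("alice", new java.util.Date(), "/home", 3)))

    val log = triple.map(p => Log(p._1, p._2, p._3, p._4))
    log.registerTempTable("logs")

    val logSessioni = sqlContext.sql("SELECT visitatore, data, pagina, count FROM logs")
    logSessioni.foreach(println)
  }
}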
