in Big Data Hadoop & Spark by (11.5k points)

Once I have a Row in Spark, whether from a DataFrame or from Catalyst, I want to convert it to a case class in my code. This can be done by pattern matching:

someRow match { case Row(a: Long, b: String, c: Double) => myCaseClass(a, b, c) }


But this becomes ugly when the row has a large number of columns, say a dozen Doubles, some Booleans, and even the occasional null.

I would just like to be able to (sorry) cast the Row to myCaseClass. Is that possible, or do I already have the most economical syntax?

1 Answer

by (31.4k points)

I would suggest importing spark.implicits._ and converting the DataFrame to a typed Dataset with as[T]:

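The code in the original answer was attached as an image that did not survive extraction. Here is a minimal sketch of the usual implicits-based approach, assuming a case class whose field names and types match the DataFrame's columns (the `MyCaseClass` fields below mirror the `Row(a: Long, b: String, c: Double)` from the question):

```scala
import org.apache.spark.sql.SparkSession

// Field names and types must match the DataFrame's columns
// for Spark's encoder to map them automatically.
case class MyCaseClass(a: Long, b: String, c: Double)

object RowToCaseClass {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("row-to-case-class")
      .master("local[*]")
      .getOrCreate()

    // Brings in the encoders and toDF/toDS syntax.
    import spark.implicits._

    val df = Seq((1L, "x", 1.5), (2L, "y", 2.5)).toDF("a", "b", "c")

    // Convert the untyped DataFrame into a typed Dataset[MyCaseClass];
    // columns are matched to case class fields by name.
    val ds = df.as[MyCaseClass]

    ds.collect().foreach(println)

    spark.stop()
  }
}
```

This avoids writing a pattern match over every column: `as[T]` uses Spark's encoders to do the Row-to-case-class conversion for you, and will fail with an `AnalysisException` at conversion time if names or types don't line up.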



