in Big Data Hadoop & Spark by (11.5k points)

In Scala I can flatten a collection using:

    val array = Array(List("1,2,3").iterator, List("1,4,5").iterator)
    // array: Array[Iterator[String]] = Array(non-empty iterator, non-empty iterator)

    array.toList.flatten
    // res0: List[String] = List(1,2,3, 1,4,5)


But how can I do something similar in Spark?

1 Answer

by (31.4k points)

Simply use the flatMap method with an identity function (y => y). RDDs have no flatten method, but flatMap with identity gives the same result. Follow the code below:

    val x = sc.parallelize(List(List("a"), List("b"), List("c", "d")))

    x.collect()
    // res: Array[List[String]] = Array(List(a), List(b), List(c, d))

    x.flatMap(y => y)
    // a transformation: returns an RDD[String], not yet evaluated

    x.flatMap(y => y).collect()
    // res: Array[String] = Array(a, b, c, d)
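To see why this works without needing a Spark cluster, here is a minimal plain-Scala sketch (object and variable names are illustrative, not from the original post) showing that flatMap with an identity function is equivalent to flatten on ordinary collections; the RDD idiom above relies on the same equivalence:

```scala
// Plain Scala demo: flatMap(y => y) == flatten on nested collections.
// This mirrors what x.flatMap(y => y) does on the RDD above.
object FlattenDemo {
  def main(args: Array[String]): Unit = {
    val nested = List(List("a"), List("b"), List("c", "d"))

    val viaFlatten = nested.flatten          // List(a, b, c, d)
    val viaFlatMap = nested.flatMap(y => y)  // same result

    assert(viaFlatten == viaFlatMap)
    println(viaFlatMap.mkString(","))        // prints a,b,c,d
  }
}
```

You can also write the shorter `nested.flatMap(identity)`, since Scala's built-in identity function does the same thing as `y => y`.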



