Mapping in Spark Scala


I am new to Spark, Scala, and programming in general.

What I want to accomplish is the following:

I have an RDD of type `org.apache.spark.rdd.RDD[(Double, Iterable[String])]`.

So its content might be, for example:

<1, (a,b,c)>  <42, (a)>  <0, (c,d)>

I need to transform it into a new RDD so that the output looks like:

<1, a> <1, b> <1, c> <42, a> <0, c> <0, d> 

This should be simple, but I have tried many different ways and couldn't get it right.

You can use `flatMapValues`:

import org.apache.spark.SparkContext._
import org.apache.spark.rdd.RDD

val r: RDD[(Double, Iterable[String])] = ...
r.flatMapValues(x => x)
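`flatMapValues(x => x)` pairs each key with every element of its value collection, which is exactly the flattening asked for above. Running it needs a Spark context, but the semantics can be shown with plain Scala collections; here is a minimal sketch using a `Seq` of pairs in place of the RDD (the data values mirror the question's example):

```scala
// Plain-Scala sketch of what flatMapValues(x => x) does on a pair RDD,
// using an ordinary Seq of (key, values) pairs so no Spark is needed.
val pairs: Seq[(Double, Iterable[String])] = Seq(
  (1.0, Seq("a", "b", "c")),
  (42.0, Seq("a")),
  (0.0, Seq("c", "d"))
)

// Emit one (key, element) pair for each element of each value collection,
// keeping the key unchanged -- the same effect as flatMapValues on an RDD.
val flattened: Seq[(Double, String)] =
  pairs.flatMap { case (k, vs) => vs.map(v => (k, v)) }

println(flattened)
// List((1.0,a), (1.0,b), (1.0,c), (42.0,a), (0.0,c), (0.0,d))
```

On an actual RDD, `flatMapValues` has the added benefit of preserving the partitioner, since keys are never changed.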
