Mapping in Spark Scala


I am new to Spark, Scala, and programming in general.

What I want to accomplish is the following:

I have an RDD of type org.apache.spark.rdd.RDD[(Double, Iterable[String])].

So, for example, its contents might be:

<1,  (a, b, c)>
<42, (a)>
<0,  (c, d)>

I need to transform it into a new RDD whose output looks like this:

<1, a> <1, b> <1, c> <42, a> <0, c> <0, d> 

This has to be simple, but I have tried many different ways and couldn't get it right.

You can use flatMapValues:

import org.apache.spark.SparkContext._
import org.apache.spark.rdd.RDD

val r: RDD[(Double, Iterable[String])] = ...
r.flatMapValues(x => x)
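For intuition, flatMapValues applies the function to each value and emits one output pair per element of the result, keeping the original key. Here is a minimal, Spark-free sketch of the same transformation on a plain Scala collection (the sample data is made up to match the example above):

```scala
// Sample data shaped like the RDD above: (Double, Iterable[String]) pairs.
val pairs = Seq(
  (1.0,  Seq("a", "b", "c")),
  (42.0, Seq("a")),
  (0.0,  Seq("c", "d"))
)

// The collection equivalent of rdd.flatMapValues(x => x):
// for each (key, values) pair, emit one (key, value) pair per element.
val flattened = pairs.flatMap { case (k, vs) => vs.map(v => (k, v)) }

println(flattened)
// List((1.0,a), (1.0,b), (1.0,c), (42.0,a), (0.0,c), (0.0,d))
```

On an actual RDD, flatMapValues does exactly this flattening in a distributed way, and it also preserves any partitioner on the keys, which a plain flatMap would not.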
