hadoop - How to pass a String as a value in a mapper?


I am trying to pass a String as a value in my mapper, but I am getting an error saying it is not Writable. How do I resolve this?

public void map(LongWritable key, Text value, Context context) throws IOException, InterruptedException {
    String tempString = value.toString();
    String[] singleRecord = tempString.split("\t");

    // using Integer.parseInt to calculate the profit
    int amount = Integer.parseInt(singleRecord[7]);
    int asset = Integer.parseInt(singleRecord[8]);
    int salesPrice = Integer.parseInt(singleRecord[9]);
    int profit = amount * (salesPrice - asset);

    String valueProfit = String.valueOf(profit);
    String valueOne = String.valueOf(one);

    custId.set(singleRecord[2]);
    data.set(valueOne + valueProfit);
    context.write(custId, data);
}

Yahoo's tutorial says:
Objects which can be marshaled to or from files and across the network must obey a particular interface, called Writable, which allows Hadoop to read and write the data in a serialized form for transmission.

From the Cloudera site:
The key and value classes have to be serializable by the framework and hence need to implement the Writable interface. Additionally, the key classes have to implement the WritableComparable interface to facilitate sorting.

So you need an implementation of Writable to write your value into the context. Hadoop ships with a few stock classes, such as IntWritable. The String counterpart you are looking for is the Text class. It can be used as:

context.write(custId, new Text(data));

or

Text outValue = new Text();
outValue.set(data);
context.write(custId, outValue);

In case you need specialized functionality in the value class, you may implement Writable yourself (not a big deal after all). It seems that Text is enough for you, though.
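If you do end up rolling your own value class, the sketch below shows the shape it would take. The class name ProfitWritable and its two int fields are hypothetical, chosen to match the profit computation in the question; in a real job the class would declare implements org.apache.hadoop.io.Writable, which requires exactly the write and readFields methods shown (plus a no-arg constructor so the framework can instantiate it). The round-trip through plain java.io streams here stands in for what Hadoop does during serialization.

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.DataInput;
import java.io.DataInputStream;
import java.io.DataOutput;
import java.io.DataOutputStream;
import java.io.IOException;

// Hypothetical custom value class. In a real job, add:
//   implements org.apache.hadoop.io.Writable
class ProfitWritable {
    private int one;     // the count field from the question
    private int profit;  // the computed profit

    // Writable implementations need a no-arg constructor.
    public ProfitWritable() {}

    public ProfitWritable(int one, int profit) {
        this.one = one;
        this.profit = profit;
    }

    // Serialize the fields, in a fixed order.
    public void write(DataOutput out) throws IOException {
        out.writeInt(one);
        out.writeInt(profit);
    }

    // Deserialize the fields, in the same order.
    public void readFields(DataInput in) throws IOException {
        one = in.readInt();
        profit = in.readInt();
    }

    public int getOne() { return one; }
    public int getProfit() { return profit; }
}

public class Main {
    public static void main(String[] args) throws IOException {
        ProfitWritable outValue = new ProfitWritable(1, 500);

        // Round-trip through a byte stream, as the framework would.
        ByteArrayOutputStream buf = new ByteArrayOutputStream();
        outValue.write(new DataOutputStream(buf));

        ProfitWritable copy = new ProfitWritable();
        copy.readFields(new DataInputStream(
                new ByteArrayInputStream(buf.toByteArray())));

        System.out.println(copy.getOne() + " " + copy.getProfit());
    }
}
```

With a class like this you could write context.write(custId, new ProfitWritable(1, profit)) instead of concatenating the numbers into a Text value.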

