Scrapy not working


Just a newbie question about Scrapy...

The code below is not working:

    import scrapy
    from stack.items import StackItem  # assuming StackItem is defined in the project's items module

    class StackSpider(scrapy.Spider):
        name = 'stack'
        start_urls = [
            'http://stackoverflow.com'
        ]
        cookies_enabled = False
        item = StackItem()

        def parse(self, response):
            self.crawlgeneral(response)

        def crawlgeneral(self, response):
            for sel in response.xpath('//*[@id="question-mini-list"]'):
                self.item['name'] = sel.xpath('div/div[2]/h3/a').extract()
            yield self.item

However, it works fine if I don't move the logic into a separate function:

    class StackSpider(scrapy.Spider):
        name = 'stack'
        start_urls = [
            'http://stackoverflow.com'
        ]
        cookies_enabled = False
        item = StackItem()

        def parse(self, response):
            for sel in response.xpath('//*[@id="question-mini-list"]'):
                self.item['name'] = sel.xpath('div/div[2]/h3/a').extract()
            yield self.item
            self.crawlgeneral(response)

        def crawlgeneral(self, response):
            print('now in!... ')

Can anyone figure out what's wrong with the first version? I'm trying to make the code nicer by separating the logic into functions...

You forgot the return. `crawlgeneral()` contains a `yield`, so it is a generator function: calling it only creates a generator object, and `parse()` throws that object away instead of handing it back to Scrapy. Return it so Scrapy can iterate over the yielded items:

    def parse(self, response):
        return self.crawlgeneral(response)
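The underlying behavior can be shown without Scrapy at all (a minimal sketch with made-up function names): the body of a generator function never runs unless someone iterates the generator it returns.

    def items():
        # generator function: the body runs only when the result is iterated
        yield 'a'
        yield 'b'

    def parse_wrong():
        items()          # generator created and immediately discarded; nothing runs

    def parse_right():
        return items()   # hand the generator back to the caller

    print(parse_wrong())         # None - no items ever reach the caller
    print(list(parse_right()))   # ['a', 'b']

This is exactly why the second spider worked: there the `yield` sat directly inside `parse()`, so `parse()` itself was the generator Scrapy iterated.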
