Scrapy not working

Just a newbie Scrapy question. The code below isn't working:
import scrapy

class StackSpider(scrapy.Spider):
    name = 'stack'
    start_urls = ['http://stackoverflow.com']
    cookies_enabled = False
    item = StackItem()

    def parse(self, response):
        self.crawlgeneral(response)

    def crawlgeneral(self, response):
        for sel in response.xpath('//*[@id="question-mini-list"]'):
            self.item['name'] = sel.xpath('div/div[2]/h3/a').extract()
            yield self.item
However, it works fine if I don't split it out into a function:
class StackSpider(scrapy.Spider):
    name = 'stack'
    start_urls = ['http://stackoverflow.com']
    cookies_enabled = False
    item = StackItem()

    def parse(self, response):
        for sel in response.xpath('//*[@id="question-mini-list"]'):
            self.item['name'] = sel.xpath('div/div[2]/h3/a').extract()
            yield self.item
        self.crawlgeneral(response)

    def crawlgeneral(self, response):
        print('now in!...')
Can anyone figure out what's wrong with the first version? I'm just trying to make the code nicer by separating the logic into functions.
You forgot the return. Because `crawlgeneral` contains a `yield`, calling it only creates a generator object; `parse` has to hand that generator back to Scrapy, otherwise the items are silently discarded:
def parse(self, response):
    return self.crawlgeneral(response)
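You can see the same effect without Scrapy at all. A minimal sketch (the function names here are made up for illustration): calling a generator function merely builds a generator object, so a caller that doesn't return or iterate it produces nothing.

```python
def helper():
    # generator function: the body only runs when the generator is iterated
    yield 1
    yield 2

def parse_broken():
    helper()           # creates a generator object, then throws it away
    # implicitly returns None -> the caller sees no items

def parse_fixed():
    return helper()    # hand the generator back to the caller

print(list(parse_fixed()))  # [1, 2]
print(parse_broken())       # None
```

This is exactly what happens in the spider: Scrapy iterates whatever `parse` returns, so `parse` must return (or re-yield) the helper's generator.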