
python 3.x - Can't get rid of blank rows in csv output

I've written a very small Python Scrapy script to parse the name, street, and phone number displayed across multiple pages of the Yellow Pages website. When I run the script it works smoothly; the only problem is how the data end up in the CSV output: there is always a blank row between every two data rows. In other words, data are written to every other row. If it were not for Scrapy, I could have opened the file with newline=''. But here I don't control how the output file is opened, so I'm stuck. How can I get rid of the blank lines in the CSV output? Thanks in advance for taking a look.
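For reference, with the plain csv module the blank rows disappear as soon as the file is opened with newline='', as the standard library docs recommend. A minimal sketch, reusing the same three field names the spider yields (the sample values are made up):

import csv

# One sample row with the same fields the spider yields
rows = [{'name': 'Some Pizza Place', 'street': '123 Main St', 'phone': '(555) 555-5555'}]

# newline='' stops the text layer from translating the csv writer's
# '\r\n' row terminator into '\r\r\n' on Windows, which is exactly
# what shows up as a blank line between records.
with open('items.csv', 'w', newline='') as f:
    writer = csv.DictWriter(f, fieldnames=['name', 'street', 'phone'])
    writer.writeheader()
    writer.writerows(rows)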

items.py includes:

import scrapy

class YellowpageItem(scrapy.Item):
    name = scrapy.Field()
    street = scrapy.Field()
    phone = scrapy.Field()

Here is the spider:

import scrapy

class YellowpageSpider(scrapy.Spider):
    name = "YellowpageSp"
    # Pages 2-5 of the Yellow Pages pizza listings for Los Angeles
    start_urls = ["https://www.yellowpages.com/search?search_terms=Pizza&geo_location_terms=Los%20Angeles%2C%20CA&page={0}".format(page) for page in range(2,6)]

    def parse(self, response):
        # Each business listing sits inside a div.info container
        for titles in response.css('div.info'):
            name = titles.css('a.business-name span[itemprop=name]::text').extract_first()
            street = titles.css('span.street-address::text').extract_first()
            phone = titles.css('div[itemprop=telephone]::text').extract_first()
            yield {'name': name, 'street': street, 'phone': phone}
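(Side note: in newer Scrapy versions .get() is the preferred alias for .extract_first(); both work here.)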

Here is what the CSV output looks like:

[screenshot: each scraped row in the CSV is followed by a blank row]

By the way, the command I'm using to get the CSV output is:

scrapy crawl YellowpageSp -o items.csv -t csv
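(As an aside, the -t csv flag should be optional here: Scrapy infers the feed format from the .csv extension of the -o target.)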


1 Reply


You can fix this by creating a custom feed exporter. Change your settings.py as below (here `project` stands for your Scrapy project's package name):

FEED_EXPORTERS = {
    'csv': 'project.exporters.FixLineCsvItemExporter',
}

Then create an exporters.py module in your project:

exporters.py

import io
import os
import six
import csv

from scrapy.exporters import CsvItemExporter
from scrapy.extensions.feedexport import IFeedStorage
from w3lib.url import file_uri_to_path
from zope.interface import implementer


@implementer(IFeedStorage)
class FixedFileFeedStorage(object):
    """Minimal feed storage that reopens the feed file in binary append mode."""

    def __init__(self, uri):
        self.path = file_uri_to_path(uri)

    def open(self, spider):
        # Make sure the target directory exists before opening the file
        dirname = os.path.dirname(self.path)
        if dirname and not os.path.exists(dirname):
            os.makedirs(dirname)
        return open(self.path, 'ab')

    def store(self, file):
        file.close()


class FixLineCsvItemExporter(CsvItemExporter):

    def __init__(self, file, include_headers_line=True, join_multivalued=',', **kwargs):
        super(FixLineCsvItemExporter, self).__init__(file, include_headers_line, join_multivalued, **kwargs)
        self._configure(kwargs, dont_fail=True)
        # Close the stream Scrapy opened, then reopen the same file in
        # binary mode so we can wrap it ourselves.
        self.stream.close()
        storage = FixedFileFeedStorage(file.name)
        file = storage.open(file.name)
        # newline="" prevents the text layer from adding an extra '\r'
        # to the csv writer's '\r\n' terminator on Windows.
        self.stream = io.TextIOWrapper(
            file,
            line_buffering=False,
            write_through=True,
            encoding=self.encoding,
            newline="",
        ) if six.PY3 else file
        self.csv_writer = csv.writer(self.stream, **kwargs)
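Why this works: Scrapy's file feed storage opens the output file in binary mode, and on affected versions CsvItemExporter wraps it in a text stream without newline="". On Windows that text layer translates the csv writer's '\r\n' row terminator into '\r\r\n', which spreadsheet programs display as a blank row after every record. Re-wrapping the raw file with newline="" suppresses the translation. Once the exporter and the FEED_EXPORTERS setting are in place, rerun the same crawl command and items.csv should come out without blank rows.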

I am on a Mac, so I can't test the Windows behavior. But if the above doesn't work, change the part of the code below and set newline="\n":

        self.stream = io.TextIOWrapper(
            file,
            line_buffering=False,
            write_through=True,
            encoding=self.encoding,
            newline="\n",
        ) if six.PY3 else file
