Welcome to OGeek Q&A Community for programmer and developer-Open, Learning and Share
Welcome To Ask or Share your Answers For Others

0 votes
326 views
in Technique by (71.8m points)

python - urllib.error.HTTPError: HTTP Error 403: Forbidden

I get the error "urllib.error.HTTPError: HTTP Error 403: Forbidden" when scraping certain pages, and understand that adding something like hdr = {'User-Agent': 'Mozilla/5.0'} to the request headers is the solution for this.

However, I can't make it work when the URLs I'm trying to scrape are in a separate source file. How/where can I add the User-Agent to the code below?

from bs4 import BeautifulSoup
import urllib.request as urllib2
import time

list_open = open("source-urls.txt")
read_list = list_open.read()
line_in_list = read_list.split("\n")

i = 0
for url in line_in_list:
    soup = BeautifulSoup(urllib2.urlopen(url).read(), 'html.parser')
    name = soup.find(attrs={'class': "name"})
    description = soup.find(attrs={'class': "description"})
    for text in description:
        print(name.get_text(), ';', description.get_text())
#        time.sleep(5)
    i += 1


1 Reply

0 votes
by (71.8m points)

You can achieve the same thing using requests:

import requests
from bs4 import BeautifulSoup

# Send a real browser User-Agent so the server doesn't reply with 403 Forbidden
hdrs = {'User-Agent': 'Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/52.0.2743.116 Safari/537.36'}

for url in line_in_list:
    resp = requests.get(url, headers=hdrs)
    soup = BeautifulSoup(resp.content, 'html.parser')
    name = soup.find(attrs={'class': "name"})
    description = soup.find(attrs={'class': "description"})
    print(name.get_text(), ';', description.get_text())
#    time.sleep(5)

Hope it helps!
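If you'd rather keep the urllib approach from the question, the header can be attached by wrapping each URL in a urllib.request.Request before calling urlopen. A minimal sketch (example.com stands in for one of the URLs read from source-urls.txt):

```python
import urllib.request

# Attach the User-Agent to a Request object instead of passing the bare URL
hdr = {'User-Agent': 'Mozilla/5.0'}
req = urllib.request.Request('http://example.com', headers=hdr)

# urlopen(req) then sends the header; the BeautifulSoup call stays unchanged:
# soup = BeautifulSoup(urllib.request.urlopen(req).read(), 'html.parser')
print(req.get_header('User-agent'))  # urllib normalizes header-key capitalization
```

Inside your loop you would build one Request per url from line_in_list and pass it to urlopen in place of the plain url string.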

