ProxyError: Max retries exceeded with url: /



I want to scrape a list of proxies from this page, https://free-proxy-list.net/, but I'm stuck on the following error and don't know how to fix it:

requests.exceptions.ProxyError: HTTPSConnectionPool(host='free-proxy-list.net', port=443): Max retries exceeded with url: / (Caused by ProxyError('Cannot connect to proxy.', NewConnectionError('<urllib3.connection.VerifiedHTTPSConnection object at 0x00000278BFFA1EB0>: Failed to establish a new connection: [WinError 10060] A connection attempt failed because the connected party did not properly respond after a period of time, or established connection failed because connected host has failed to respond')))

By the way, here is the relevant code:

import requests
from bs4 import BeautifulSoup
from fake_useragent import UserAgent

ua = UserAgent(cache=False)
header = {
    "User-Agent": str(ua.msie)
}
# all requests to https URLs are routed through this proxy
proxy = {
    "https": "http://95.66.151.101:8080"
}
urls = "https://free-proxy-list.net/"
res = requests.get(urls, headers=header, proxies=proxy)
soup = BeautifulSoup(res.text, 'lxml')

I also tried scraping some other websites, but I realized that isn't a solution.
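For what it's worth, the traceback says the proxy at 95.66.151.101:8080 is itself unreachable, not that the target site is blocking you. A minimal sketch of one way to surface that cleanly and fall back to a direct connection (the 10-second timeout is an arbitrary choice, and whether a dead proxy raises ProxyError or ConnectTimeout can vary, so both are caught):

import requests

urls = "https://free-proxy-list.net/"
proxy = {"https": "http://95.66.151.101:8080"}

try:
    # a short timeout makes a dead proxy fail fast instead of hanging
    res = requests.get(urls, proxies=proxy, timeout=10)
except (requests.exceptions.ProxyError, requests.exceptions.ConnectTimeout):
    # the proxy is down or unreachable; retry without it
    res = requests.get(urls, timeout=10)

print(res.status_code)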


1 Answer

I had never seen the fake_useragent module before and don't know what it's for, so I removed it. I also don't know why you added those header elements, but I don't think they are necessary for the task you describe. Looking at the page's HTML, the proxies live under section id="list" > div class="container" > tbody. The code below prints everything in that region, which includes all of the proxies. If you want something more specific, you can modify it.

import requests
from bs4 import BeautifulSoup

urls = "https://free-proxy-list.net/"
res = requests.get(urls)
soup = BeautifulSoup(res.text, "html.parser")

# the proxy table's rows all sit inside the page's <tbody>
tbody = soup.find("tbody")

print(tbody.prettify())
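To take that one step further, here is a sketch that collects each table row into an ip:port string, assuming the table keeps its current column order with the IP address first and the port second:

import requests
from bs4 import BeautifulSoup

urls = "https://free-proxy-list.net/"
res = requests.get(urls)
soup = BeautifulSoup(res.text, "html.parser")

proxies = []
for row in soup.find("tbody").find_all("tr"):
    cells = [td.get_text(strip=True) for td in row.find_all("td")]
    if len(cells) >= 2:
        # assumes column 0 is the IP address and column 1 is the port
        proxies.append(cells[0] + ":" + cells[1])

print(proxies[:5])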
