<p>I ran into the same problem not long ago. Here is the snippet I use so I can scrape anonymously:</p>
<pre><code>from urllib.request import Request, urlopen
from fake_useragent import UserAgent
import random
from bs4 import BeautifulSoup
from IPython.display import clear_output

ua = UserAgent()  # used to generate random user-agent strings
proxies = []      # will hold proxies as {'ip': ..., 'port': ...}

# Main function
def main():
    # Retrieve the latest free proxies
    proxies_req = Request('https://www.sslproxies.org/')
    proxies_req.add_header('User-Agent', ua.random)
    proxies_doc = urlopen(proxies_req).read().decode('utf8')

    soup = BeautifulSoup(proxies_doc, 'html.parser')
    proxies_table = soup.find(id='proxylisttable')

    # Save each proxy's ip and port in the list
    for row in proxies_table.tbody.find_all('tr'):
        proxies.append({
            'ip': row.find_all('td')[0].string,
            'port': row.find_all('td')[1].string
        })

    # Choose a random proxy to start with
    proxy_index = random_proxy()
    proxy = proxies[proxy_index]

    for n in range(1, 20):
        # Every 10 requests, switch to a new random proxy
        if n % 10 == 0:
            proxy_index = random_proxy()
            proxy = proxies[proxy_index]

        req = Request('http://icanhazip.com')
        req.set_proxy(proxy['ip'] + ':' + proxy['port'], 'http')

        # Make the call and print the IP the target site sees
        try:
            my_ip = urlopen(req).read().decode('utf8')
            print('#' + str(n) + ': ' + my_ip)
            clear_output(wait=True)
        except Exception:  # on error, delete this proxy and pick another one
            del proxies[proxy_index]
            print('Proxy ' + proxy['ip'] + ':' + proxy['port'] + ' deleted.')
            proxy_index = random_proxy()
            proxy = proxies[proxy_index]

# Return a random index into proxies (we need the index to delete a proxy if it stops working)
def random_proxy():
    return random.randint(0, len(proxies) - 1)

if __name__ == '__main__':
    main()
</code></pre>
<p>This builds a pool of working proxies.</p>
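<p>For the user-agent part, a minimal sketch of a <code>user_agent_list</code> could look like this (the strings below are just examples of common browser user agents, any reasonably recent ones will do):</p>
<pre><code># A few browser user agents to rotate through (example values)
user_agent_list = [
    'Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/91.0.4472.124 Safari/537.36',
    'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/605.1.15 (KHTML, like Gecko) Version/14.1.1 Safari/605.1.15',
    'Mozilla/5.0 (X11; Linux x86_64; rv:89.0) Gecko/20100101 Firefox/89.0',
]
</code></pre>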
<p>This creates different headers that pretend to be a browser.
Last but not least, just plug both into the request:</p>
<pre><code>from requests import get

# Make a GET request with a random user agent and a random proxy
user_agent = random.choice(user_agent_list)
headers = {'User-Agent': user_agent, 'Accept-Language': 'en-US, en;q=0.5'}

# requests expects the proxies argument as {scheme: 'ip:port'}
proxy = random.choice(proxies)
proxy_url = proxy['ip'] + ':' + proxy['port']
response = get("your url", headers=headers, proxies={'http': proxy_url, 'https': proxy_url})
</code></pre>
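<p>If a proxy turns out to be dead you can combine the two ideas and simply retry with a fresh proxy and user agent. A minimal sketch of such a helper (the function name, retry count and timeout are my own choices, not part of the original answer):</p>
<pre><code>import random
import requests

def fetch_with_rotation(url, proxies, user_agent_list, max_tries=5):
    """Try up to max_tries proxy/user-agent pairs; drop proxies that fail."""
    for _ in range(max_tries):
        proxy = random.choice(proxies)
        proxy_url = proxy['ip'] + ':' + proxy['port']
        headers = {'User-Agent': random.choice(user_agent_list),
                   'Accept-Language': 'en-US, en;q=0.5'}
        try:
            return requests.get(url, headers=headers, timeout=10,
                                proxies={'http': proxy_url, 'https': proxy_url})
        except requests.RequestException:
            proxies.remove(proxy)  # dead proxy, try another one
    raise RuntimeError('No working proxy found')
</code></pre>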
<p>Hope this solves your problem.</p>
<p>Otherwise have a look here: <a href="https://www.scrapehero.com/how-to-fake-and-rotate-user-agents-using-python-3/" rel="nofollow noreferrer">https://www.scrapehero.com/how-to-fake-and-rotate-user-agents-using-python-3/</a></p>
<p>Cheers</p>