Can't connect to *.onion sites using Python (<urlopen error [Errno 11001] getaddrinfo failed>)

Updated: 2024-10-12 16:29:38

Problem Description

I'm trying to access *.onion sites using Python, with no success so far. I've read a lot of Stack Overflow questions and answers and tried many different ways of solving this problem: Python 2.7 and Python 3.5; urllib, urllib2, and requests (then I found out requests doesn't work with SOCKS out of the box); PySocks; etc., but nothing seems to work. Right now I only get the following error:

> <urlopen error [Errno 11001] getaddrinfo failed>

No, I don't have a firewall; yes, I have a good internet connection; and yes, the site does exist. I think the problem is that it's a *.onion link.
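That guess can be confirmed without Tor at all. As a minimal sketch (my own illustration, not part of the question): ".onion" names are reserved for Tor (RFC 7686) and are never published in ordinary DNS, so the system resolver fails before any connection is even attempted, which is exactly where the [Errno 11001] getaddrinfo error in the traceback below comes from.

```python
import socket

def resolves(host):
    """Return True if the ordinary system resolver can look up host."""
    try:
        socket.getaddrinfo(host, 80)
        return True
    except socket.gaierror:
        # .onion names never resolve through plain DNS (RFC 7686),
        # so this is the branch a .onion address always takes.
        return False

print(resolves("xmh57jrzrnw6insl.onion"))  # False: plain DNS cannot resolve .onion
```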

This is what I'm doing right now:

import socks
import socket
import urllib.request

socks.set_default_proxy(socks.SOCKS5, "127.0.0.1", 9050)
socket.socket = socks.socksocket
r = urllib.request.urlopen("xmh57jrzrnw6insl.onion")
r.read()

This is what I get:

---------------------------------------------------------------------------
gaierror                                  Traceback (most recent call last)
C:\Users\yella\Anaconda3\lib\urllib\request.py in do_open(self, http_class, req, **http_conn_args)
   1239         try:
-> 1240             h.request(req.get_method(), req.selector, req.data, headers)
   1241         except OSError as err: # timeout error

[... frames in http\client.py (request, _send_request, endheaders, _send_output, send, connect) ...]

C:\Users\yella\Anaconda3\lib\socket.py in getaddrinfo(host, port, family, type, proto, flags)
    731     addrlist = []
--> 732     for res in _socket.getaddrinfo(host, port, family, type, proto, flags):
    733         af, socktype, proto, canonname, sa = res

gaierror: [Errno 11001] getaddrinfo failed

During handling of the above exception, another exception occurred:

URLError                                  Traceback (most recent call last)
<ipython-input-72-1e30353c3485> in <module>()
----> 1 r = urllib.request.urlopen("xmh57jrzrnw6insl.onion:80")
      2 r.read()

[... frames in urllib\request.py (urlopen, open, _open, _call_chain, http_open) ...]

C:\Users\yella\Anaconda3\lib\urllib\request.py in do_open(self, http_class, req, **http_conn_args)
   1240             h.request(req.get_method(), req.selector, req.data, headers)
   1241         except OSError as err: # timeout error
-> 1242             raise URLError(err)

URLError: <urlopen error [Errno 11001] getaddrinfo failed>

I'm very new to all this stuff, so I might be missing some really simple parts. But I'll be grateful for any help.

PS: when trying to access a site that isn't a *.onion site, I get the following:

> [WinError 10061] No connection could be made because the target machine actively refused it

Answer

I'm on Linux, but the code you supplied didn't work for me either. From the looks of it, DNS resolution is not happening over Tor (based on error 11001, WSAHOST_NOT_FOUND). I'm also a little suspicious that it's actually using Tor at all, because of the 10061 (connection refused) error.
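That 10061 suspicion is easy to check. As a quick stdlib sanity check (my own illustration, not from the original answer): "connection refused" means nothing is listening at the address you dialed, so the first thing to verify is that Tor's SOCKS listener is actually up on the default 127.0.0.1:9050.

```python
import socket

def port_open(host, port, timeout=1.0):
    """Return True if something is listening at host:port."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:  # ConnectionRefusedError, timeouts, unreachable hosts
        return False

# If this prints False, Tor's SOCKS listener isn't running, and every
# proxied request will fail with "connection refused" (WinError 10061).
print(port_open("127.0.0.1", 9050))
```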

In any case, I was able to get it working with this:

import urllib2
import socks
from sockshandler import SocksiPyHandler

# Python 2: route the request through Tor's SOCKS proxy without
# monkeypatching the socket module (the True argument enables remote DNS,
# so the .onion name is resolved by Tor rather than the local resolver)
opener = urllib2.build_opener(SocksiPyHandler(socks.SOCKS5, "127.0.0.1", 9050, True))
print opener.open("http://xmh57jrzrnw6insl.onion").read()

PySocks says in their docs:

Note that monkeypatching may not work for all standard modules or for all third party modules, and generally isn't recommended. Monkeypatching is usually an anti-pattern in Python.

The monkeypatching here is the socket.socket = socks.socksocket line.

If possible, use requests with the socks5h:// protocol handler for your proxies:

import requests

# socks5h:// makes requests resolve hostnames through the proxy,
# which is required for .onion addresses
proxies = {
    'http': 'socks5h://127.0.0.1:9050',
    'https': 'socks5h://127.0.0.1:9050'
}
data = requests.get("http://xmh57jrzrnw6insl.onion", proxies=proxies).text
print(data)
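The "h" in socks5h is the whole trick (the helper name below is my own, not a requests API): with a plain socks5:// proxy URL, requests resolves the hostname locally and hits the same getaddrinfo failure, while socks5h:// sends the hostname to the proxy so Tor resolves the .onion address itself.

```python
def tor_proxies(host="127.0.0.1", port=9050):
    """Build a requests-style proxies dict that resolves names via the proxy."""
    url = "socks5h://{}:{}".format(host, port)
    return {"http": url, "https": url}

# Sanity check: both schemes point at Tor's default SOCKS listener.
assert tor_proxies() == {
    "http": "socks5h://127.0.0.1:9050",
    "https": "socks5h://127.0.0.1:9050",
}
```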
