Failed to establish a new connection: [Errno 111] Connection refused

Time: 2018-07-28 12:16:16

Tags: python elasticsearch scrapy elasticsearch-5 privoxy

I am running a Scrapy project in which I use a proxy and insert the scraped data into Elasticsearch during the crawl. The code works: I can crawl the site and index the data into Elasticsearch. The problem appears after a few minutes, when I get this traceback:

2018-07-28 07:44:58 [urllib3.connectionpool] DEBUG: Starting new HTTP connection (1): localhost:9200
2018-07-28 07:44:58 [elasticsearch] WARNING: POST http://localhost:9200/_bulk [status:N/A request:0.002s]
Traceback (most recent call last):
File "/usr/local/lib/python2.7/dist-packages/elasticsearch/connection/http_urllib3.py", line 115, in perform_request
response = self.pool.urlopen(method, url, body, retries=False, headers=self.headers, **kw)
File "/usr/local/lib/python2.7/dist-packages/urllib3/connectionpool.py", line 638, in urlopen
_stacktrace=sys.exc_info()[2]) 
File "/usr/local/lib/python2.7/dist-packages/urllib3/util/retry.py", line 343, in increment
raise six.reraise(type(error), error, _stacktrace)
File "/usr/local/lib/python2.7/dist-packages/urllib3/connectionpool.py", line 600, in urlopen
chunked=chunked)
File "/usr/local/lib/python2.7/dist-packages/urllib3/connectionpool.py", line 354, in _make_request
conn.request(method, url, **httplib_request_kw)
File "/usr/lib/python2.7/httplib.py", line 1057, in request
self._send_request(method, url, body, headers)
File "/usr/lib/python2.7/httplib.py", line 1097, in _send_request
self.endheaders(body)
File "/usr/lib/python2.7/httplib.py", line 1053, in endheaders
self._send_output(message_body)
File "/usr/lib/python2.7/httplib.py", line 897, in _send_output
self.send(msg)
File "/usr/lib/python2.7/httplib.py", line 859, in send
self.connect()
File "/usr/local/lib/python2.7/dist-packages/urllib3/connection.py", line 196, in connect
conn = self._new_conn()
File "/usr/local/lib/python2.7/dist-packages/urllib3/connection.py", line 180, in _new_conn
self, "Failed to establish a new connection: %s" % e)
NewConnectionError: <urllib3.connection.HTTPConnection object at 0x7f15b365b5d0>: Failed to establish a new connection: [Errno 111] Connection refused

Do you have any idea what causes this? As I mentioned, the crawl runs fine for a few minutes, and then I suddenly get this traceback...
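
For context, the kind of pipeline setup described in the question might look roughly like the sketch below. The original pipeline code is not shown, so the index name, document type, item fields, batch size, and the use of elasticsearch.helpers.bulk here are all assumptions for illustration, not the actual project code.

# Minimal sketch of a Scrapy pipeline that bulk-inserts items into Elasticsearch.
# NOTE: index name, doc type, fields and batch size are placeholders (assumptions).
from elasticsearch import Elasticsearch
from elasticsearch.helpers import bulk


class ElasticsearchPipeline(object):
    def open_spider(self, spider):
        # Single local Elasticsearch node on the default port, as in the traceback above.
        self.es = Elasticsearch([{'host': 'localhost', 'port': 9200}])
        self.actions = []

    def process_item(self, item, spider):
        # Buffer items and flush them in batches through the _bulk endpoint.
        self.actions.append({
            '_index': 'scraped_items',   # placeholder index name
            '_type': 'item',             # doc type (still required on ES 5.x)
            '_source': dict(item),
        })
        if len(self.actions) >= 100:
            bulk(self.es, self.actions)
            self.actions = []
        return item

    def close_spider(self, spider):
        # Flush whatever is left when the crawl finishes.
        if self.actions:
            bulk(self.es, self.actions)

Buffering and flushing in batches keeps the number of _bulk requests low; if the node stops accepting connections mid-crawl, the client raises a connection error from inside perform_request, which is what the traceback above shows.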

0 Answers:

No answers yet.