Installing package dependencies for Scrapy

Date: 2014-06-21 06:10:41

Tags: python windows python-2.7 scrapy pyopenssl

Of the many packages a user needs to install for Scrapy, I believe I've run into a problem with pyOpenSSL.

When I try to create the tutorial Scrapy project, I get the following output:

Traceback (most recent call last):
  File "C:\Python27\lib\runpy.py", line 162, in _run_module_as_main
    "__main__", fname, loader, pkg_name)
  File "C:\Python27\lib\runpy.py", line 72, in _run_code
    exec code in run_globals
  File "C:\Python27\lib\site-packages\scrapy\cmdline.py", line 168, in <module>
    execute()
  File "C:\Python27\lib\site-packages\scrapy\cmdline.py", line 122, in execute
    cmds = _get_commands_dict(settings, inproject)
  File "C:\Python27\lib\site-packages\scrapy\cmdline.py", line 46, in _get_commands_dict
    cmds = _get_commands_from_module('scrapy.commands', inproject)
  File "C:\Python27\lib\site-packages\scrapy\cmdline.py", line 29, in _get_commands_from_module
    for cmd in _iter_command_classes(module):
  File "C:\Python27\lib\site-packages\scrapy\cmdline.py", line 20, in _iter_command_classes
    for module in walk_modules(module_name):
  File "C:\Python27\lib\site-packages\scrapy\utils\misc.py", line 68, in walk_modules
    submod = import_module(fullpath)
  File "C:\Python27\lib\importlib\__init__.py", line 37, in import_module
    __import__(name)
  File "C:\Python27\lib\site-packages\scrapy\commands\bench.py", line 3, in <module>
    from scrapy.tests.mockserver import MockServer
  File "C:\Python27\lib\site-packages\scrapy\tests\mockserver.py", line 6, in <module>
    from twisted.internet import reactor, defer, ssl
  File "C:\Python27\lib\site-packages\twisted\internet\ssl.py", line 59, in <module>
    from OpenSSL import SSL
  File "build\bdist.win32\egg\OpenSSL\__init__.py", line 8, in <module>
  File "build\bdist.win32\egg\OpenSSL\rand.py", line 11, in <module>
  File "build\bdist.win32\egg\OpenSSL\_util.py", line 3, in <module>
ImportError: No module named cryptography.hazmat.bindings.openssl.binding

When I searched for the last error (No module named cryptography.hazmat...), I saw several mentions of pyOpenSSL. So I went ahead and ran easy_install pyOpenSSL==0.14 to make sure I had the latest version, but when I did, I got this output:

c:\python27\include\pymath.h(22) : warning C4273: 'round' : inconsistent dll linkage
        C:\Program Files (x86)\Microsoft Visual Studio 12.0\VC\INCLUDE\math.h(516) : see previous definition of 'round'
c:\users\bk\appdata\local\temp\easy_install-tztawu\cryptography-0.4\temp\easy_install-svxsjy\cffi-0.8.2\c\misc_win32.h(225) : error C2632: 'char' followed by 'bool' is illegal
c:\users\bk\appdata\local\temp\easy_install-tztawu\cryptography-0.4\temp\easy_install-svxsjy\cffi-0.8.2\c\misc_win32.h(225) : warning C4091: 'typedef ' : ignored on left of 'unsigned char' when no variable is declared
c/_cffi_backend.c(5295) : warning C4146: unary minus operator applied to unsigned type, result still unsigned
c/_cffi_backend.c(5296) : warning C4146: unary minus operator applied to unsigned type, result still unsigned
c/_cffi_backend.c(5297) : warning C4146: unary minus operator applied to unsigned type, result still unsigned
c/_cffi_backend.c(5298) : warning C4146: unary minus operator applied to unsigned type, result still unsigned
error: Setup script exited with error: command '"C:\Program Files (x86)\Microsoft Visual Studio 12.0\VC\BIN\cl.exe"' failed with exit status 2

So I'm a bit lost as to what I need to do to get Scrapy running properly.
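Since the traceback walks through a chain of packages (pyOpenSSL imports cryptography, which builds on cffi), a small stdlib-only probe can narrow down which layer actually fails to import. This is a diagnostic sketch, not part of any fix; the module names come straight from the traceback above:

```python
import importlib

def probe(names):
    """Try importing each dotted module name; map it to None on success
    or to the ImportError text on failure."""
    results = {}
    for name in names:
        try:
            importlib.import_module(name)
            results[name] = None
        except ImportError as exc:
            results[name] = str(exc)
    return results

# The dependency chain from the traceback, outermost to innermost.
chain = ["OpenSSL", "cryptography.hazmat.bindings.openssl.binding", "cffi"]
for name, err in probe(chain).items():
    print(name, "->", "OK" if err is None else "FAILED: " + err)
```

Whichever name reports FAILED first is the package to reinstall.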

4 Answers:

Answer 0 (score: 21):

I ran into the same error on Mac OS.

I solved it by using pyOpenSSL 0.13 instead of the latest version:

easy_install pyOpenSSL==0.13

or

pip install pyOpenSSL==0.13
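After the downgrade, a minimal check like this can confirm that the import now succeeds and which version is active (the only assumption is that the OpenSSL package exposes `__version__`, which pyOpenSSL does):

```python
def pyopenssl_status():
    """Report whether pyOpenSSL imports cleanly and, if so, its version."""
    try:
        import OpenSSL  # the pyOpenSSL distribution installs as 'OpenSSL'
        return "pyOpenSSL %s imported OK" % OpenSSL.__version__
    except ImportError as exc:
        return "import failed: %s" % exc

print(pyopenssl_status())
```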

Answer 1 (score: 1):

I strongly recommend using conda instead of pip, especially on Windows. Among many other things, it fetches the appropriate binaries for your system. It makes setting up a scientific Python environment (think SciPy, NumPy, Pandas...) a breeze.

So read up on Anaconda, install it, then run:

conda create -n scrapyenv python=2  # creates a new py2 environment
activate scrapyenv                  # switch to the new environment
conda install scrapy                # install scrapy

Steps 1 and 2 are only needed if you want to keep this encapsulated in a separate environment. Incidentally, running conda install anaconda will install a whole suite of useful packages.
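If you do use a separate environment, a quick stdlib-only check shows which interpreter and environment are currently active (nothing conda-specific is assumed here; inside an activated env, sys.prefix ends with the env's directory):

```python
import sys

# sys.executable is the interpreter binary being run;
# sys.prefix is the root of the active environment.
print("Interpreter:", sys.executable)
print("Environment root:", sys.prefix)
```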

Also, if conda doesn't include pyOpenSSL, or you don't want to install Anaconda, check out point 9 of the tutorial How to install Scrapy in 64bit Windows 7.

Answer 2 (score: 0):

I ran into the same problem and tried the first answer, but it didn't work. In the end, I uninstalled pyOpenSSL, downloaded it manually, and ran its setup. Problem solved. The pyOpenSSL download URL is: https://launchpad.net/pyopenssl

Answer 3 (score: 0):

You should upgrade pip before trying to install Scrapy:

pip install --upgrade pip
pip install Scrapy