Error deploying a scrapyd project

Date: 2016-11-21 23:35:16

Tags: scrapy scrapyd

When I try to run this command:

scrapyd-deploy test -p project=myProject

I get the following error:

Traceback (most recent call last):
  File "/usr/bin/scrapyd-deploy", line 269, in <module>
    main()
  File "/usr/bin/scrapyd-deploy", line 95, in main
    egg, tmpdir = _build_egg()
  File "/usr/bin/scrapyd-deploy", line 236, in _build_egg
    retry_on_eintr(check_call, [sys.executable, 'setup.py', 'clean', '-a', 'bdist_egg', '-d', d], stdout=o, stderr=e)
  File "/usr/local/lib/python2.7/dist-packages/scrapy/utils/python.py", line 331, in retry_on_eintr
    return function(*args, **kw)
  File "/usr/lib/python2.7/subprocess.py", line 540, in check_call
    raise CalledProcessError(retcode, cmd)
subprocess.CalledProcessError: Command '['/usr/bin/python', 'setup.py', 'clean', '-a', 'bdist_egg', '-d', '/tmp/scrapydeploy-wV3h4k']' returned non-zero exit status 1

I have installed scrapyd-deploy and scrapyd-client, and of course setuptools and scrapyd.
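For reference, scrapyd-deploy resolves the `test` target from the scrapy.cfg file at the project root. A minimal sketch of what that file is assumed to look like here (the module name is inferred from the build output below; the URL and project name are guesses):

```ini
# scrapy.cfg -- hypothetical values for this project
[settings]
default = NOAA.settings

[deploy:test]
url = http://localhost:6800/
project = myProject
```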

This command: python setup.py clean -a bdist_egg

produces the following output:

running clean
removing 'build/lib.linux-x86_64-2.7' (and everything under it)
removing 'build/bdist.linux-x86_64' (and everything under it)
'build/scripts-2.7' does not exist -- can't clean it
removing 'build'
running bdist_egg
running egg_info
writing NOAA.egg-info/PKG-INFO
writing top-level names to NOAA.egg-info/top_level.txt
writing dependency_links to NOAA.egg-info/dependency_links.txt
writing entry points to NOAA.egg-info/entry_points.txt
reading manifest file 'NOAA.egg-info/SOURCES.txt'
writing manifest file 'NOAA.egg-info/SOURCES.txt'
installing library code to build/bdist.linux-x86_64/egg
running install_lib
running build_py
creating build
creating build/lib.linux-x86_64-2.7
creating build/lib.linux-x86_64-2.7/NOAA
copying NOAA/pipelines.py -> build/lib.linux-x86_64-2.7/NOAA
copying NOAA/__init__.py -> build/lib.linux-x86_64-2.7/NOAA
copying NOAA/noaa-template.py -> build/lib.linux-x86_64-2.7/NOAA
copying NOAA/noaa-original.py -> build/lib.linux-x86_64-2.7/NOAA
copying NOAA/settings-template.py -> build/lib.linux-x86_64-2.7/NOAA
copying NOAA/tika-python.py -> build/lib.linux-x86_64-2.7/NOAA
copying NOAA/settings.py -> build/lib.linux-x86_64-2.7/NOAA
copying NOAA/mysqldb.py -> build/lib.linux-x86_64-2.7/NOAA
copying NOAA/noaa.py -> build/lib.linux-x86_64-2.7/NOAA
copying NOAA/items.py -> build/lib.linux-x86_64-2.7/NOAA
creating build/lib.linux-x86_64-2.7/NOAA/spiders
copying NOAA/spiders/spider1.py -> build/lib.linux-x86_64-2.7/NOAA/spiders
copying NOAA/spiders/spider2.py -> build/lib.linux-x86_64-2.7/NOAA/spiders
(output omitted... There are a lot of spiders)

creating build/bdist.linux-x86_64
creating build/bdist.linux-x86_64/egg
creating build/bdist.linux-x86_64/egg/NOAA
copying build/lib.linux-x86_64-2.7/NOAA/pipelines.py -> build/bdist.linux-x86_64/egg/NOAA
creating build/bdist.linux-x86_64/egg/NOAA/spiders
copying build/lib.linux-x86_64-2.7/NOAA/spiders/spider1.py -> build/bdist.linux-x86_64/egg/NOAA/spiders
copying build/lib.linux-x86_64-2.7/NOAA/spiders/spider2.py -> build/bdist.linux-x86_64/egg/NOAA/spiders
(output omitted... There are a lot of spiders)


byte-compiling build/bdist.linux-x86_64/egg/NOAA/pipelines.py to pipelines.pyc
byte-compiling build/bdist.linux-x86_64/egg/NOAA/spiders/spider1.py to spider1.pyc
byte-compiling build/bdist.linux-x86_64/egg/NOAA/spiders/spider2.py to spider2.pyc
(output omitted... There are a lot of spiders)


byte-compiling build/bdist.linux-x86_64/egg/NOAA/__init__.py to __init__.pyc
byte-compiling build/bdist.linux-x86_64/egg/NOAA/noaa-template.py to noaa-template.pyc
byte-compiling build/bdist.linux-x86_64/egg/NOAA/noaa-original.py to noaa-original.pyc
byte-compiling build/bdist.linux-x86_64/egg/NOAA/settings-template.py to settings-template.pyc
byte-compiling build/bdist.linux-x86_64/egg/NOAA/tika-python.py to tika-python.pyc
byte-compiling build/bdist.linux-x86_64/egg/NOAA/settings.py to settings.pyc
byte-compiling build/bdist.linux-x86_64/egg/NOAA/mysqldb.py to mysqldb.pyc
byte-compiling build/bdist.linux-x86_64/egg/NOAA/noaa.py to noaa.pyc
byte-compiling build/bdist.linux-x86_64/egg/NOAA/items.py to items.pyc
creating build/bdist.linux-x86_64/egg/EGG-INFO
copying NOAA.egg-info/PKG-INFO -> build/bdist.linux-x86_64/egg/EGG-INFO
copying NOAA.egg-info/SOURCES.txt -> build/bdist.linux-x86_64/egg/EGG-INFO
copying NOAA.egg-info/dependency_links.txt -> build/bdist.linux-x86_64/egg/EGG-INFO
copying NOAA.egg-info/entry_points.txt -> build/bdist.linux-x86_64/egg/EGG-INFO
copying NOAA.egg-info/top_level.txt -> build/bdist.linux-x86_64/egg/EGG-INFO
zip_safe flag not set; analyzing archive contents...
creating 'dist/NOAA-1.0-py2.7.egg' and adding 'build/bdist.linux-x86_64/egg' to it
removing 'build/bdist.linux-x86_64/egg' (and everything under it)

When I try to schedule a spider with this command:

curl http://localhost:6800/schedule.json -d project=myproject -d spider=spider1.py

I get this error:

Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/twisted/web
    req.requestReceived(command, path, version)
  File "/usr/local/lib/python2.7/dist-packages/twisted/web
    self.process()
  File "/usr/local/lib/python2.7/dist-packages/twisted/web
    self.render(resrc)
  File "/usr/local/lib/python2.7/dist-packages/twisted/web
    body = resrc.render(self)
--- <exception caught here> ---
  File "/usr/local/lib/python2.7/dist-packages/scrapyd/web
    return JsonResource.render(self, txrequest)
  File "/usr/local/lib/python2.7/dist-packages/scrapyd/uti
    r = resource.Resource.render(self, txrequest)
  File "/usr/local/lib/python2.7/dist-packages/twisted/web
    return m(request)
  File "/usr/local/lib/python2.7/dist-packages/scrapyd/web
    spiders = get_spider_list(project)
  File "/usr/local/lib/python2.7/dist-packages/scrapyd/uti
    runner = Config().get('runner')
  File "/usr/local/lib/python2.7/dist-packages/scrapyd/con
    self.cp.read(sources)
  File "/usr/lib/python2.7/ConfigParser.py", line 305, in
    self._read(fp, filename)
  File "/usr/lib/python2.7/ConfigParser.py", line 512, in
    raise MissingSectionHeaderError(fpname, lineno, line)
ConfigParser.MissingSectionHeaderError: File contains no s
file: /etc/scrapyd/conf.d/twistd.pid, line: 1
'24262'

Interestingly, if I add a fake section header (a [section header] line) to that twistd.pid file, I instead get an error message saying that its contents are not a valid numeric PID for twistd.
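The second traceback can be reproduced in isolation: ConfigParser raises MissingSectionHeaderError for any file whose first non-blank line is not a [section] header, which is exactly what a pid file looks like. Since scrapyd reads every file under /etc/scrapyd/conf.d/ as configuration, a stray twistd.pid in that directory triggers this on every request. A minimal sketch (using Python 3's configparser, though the traceback above is Python 2):

```python
import configparser

# A twistd.pid file contains nothing but a process id, e.g. "24262".
pid_file_contents = "24262\n"

parser = configparser.ConfigParser()
try:
    # scrapyd feeds every file in /etc/scrapyd/conf.d/ to the config
    # parser, so a pid file dropped there fails exactly like this.
    parser.read_string(pid_file_contents, source="twistd.pid")
except configparser.MissingSectionHeaderError as exc:
    print("MissingSectionHeaderError:", exc)
```

This suggests the two failures are independent: the deploy error comes from setup.py, while the scheduling error comes from the pid file sitting in scrapyd's conf.d directory, where it does not belong.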

Are these two problems related?

1 Answer:

Answer 0 (score: 0)

I ran into the same error. What worked for me was running the command with sudo.

The error probably occurs because the command is not being run with sufficient permissions.

sudo scrapyd-deploy test -p project=myProject
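If you would rather not run the deploy as root, a sketch of an alternative is to check, and if necessary reclaim, ownership of the directories that scrapyd-deploy's setup.py step writes into. The paths below are assumptions based on a standard Scrapy project layout:

```shell
# From the Scrapy project root: see who owns the directories that
# "setup.py bdist_egg" writes into.
ls -ld . build dist 2>/dev/null

# If an earlier sudo run left root-owned build artifacts behind,
# hand them back to your user (hypothetical fix -- adjust paths):
sudo chown -R "$USER":"$USER" build dist ./*.egg-info
```

A root-owned build/ or dist/ left over from a previous sudo invocation is a common way for a non-root deploy to fail with exactly this kind of non-zero exit status from setup.py.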