Scrapy beginner: tutorial. Error when running scrapy crawl dmoz

Date: 2013-08-29 11:04:57

Tags: python scrapy

I am new to Python. I am running the 64-bit version of Python 2.7.2 on 64-bit Windows 7. I installed Scrapy on my machine by following the tutorial, then created a project, demoz. But when I run scrapy crawl demoz, it shows an error.

d:\Scrapy workspace\tutorial>scrapy crawl dmoz
2013-08-29 16:10:45+0800 [scrapy] INFO: Scrapy 0.18.1 started (bot: tutorial)
2013-08-29 16:10:45+0800 [scrapy] DEBUG: Optional features available: ssl, http11
2013-08-29 16:10:45+0800 [scrapy] DEBUG: Overridden settings: {'NEWSPIDER_MODULE': 'tutorial.spiders', 'SPIDER_MODULES': ['tutorial.spiders'], 'BOT_NAME': 'tutorial'}
2013-08-29 16:10:45+0800 [scrapy] DEBUG: Enabled extensions: LogStats, TelnetConsole, CloseSpider, WebService, CoreStats, SpiderState
Traceback (most recent call last):
  File "C:\Python27\lib\runpy.py", line 162, in _run_module_as_main
    "__main__", fname, loader, pkg_name)
  File "C:\Python27\lib\runpy.py", line 72, in _run_code
    exec code in run_globals
  File "C:\Python27\lib\site-packages\scrapy-0.18.1-py2.7.egg\scrapy\cmdline.py", line 168, in <module>
    execute()
  File "C:\Python27\lib\site-packages\scrapy-0.18.1-py2.7.egg\scrapy\cmdline.py", line 143, in execute
    _run_print_help(parser, _run_command, cmd, args, opts)
  File "C:\Python27\lib\site-packages\scrapy-0.18.1-py2.7.egg\scrapy\cmdline.py", line 88, in _run_print_help
    func(*a, **kw)
  File "C:\Python27\lib\site-packages\scrapy-0.18.1-py2.7.egg\scrapy\cmdline.py", line 150, in _run_command
    cmd.run(args, opts)
  File "C:\Python27\lib\site-packages\scrapy-0.18.1-py2.7.egg\scrapy\commands\crawl.py", line 46, in run
    spider = self.crawler.spiders.create(spname, **opts.spargs)
  File "C:\Python27\lib\site-packages\scrapy-0.18.1-py2.7.egg\scrapy\command.py", line 34, in crawler
    self._crawler.configure()
  File "C:\Python27\lib\site-packages\scrapy-0.18.1-py2.7.egg\scrapy\crawler.py", line 44, in configure
    self.engine = ExecutionEngine(self, self._spider_closed)
  File "C:\Python27\lib\site-packages\scrapy-0.18.1-py2.7.egg\scrapy\core\engine.py", line 61, in __init__
    self.scheduler_cls = load_object(self.settings['SCHEDULER'])
  File "C:\Python27\lib\site-packages\scrapy-0.18.1-py2.7.egg\scrapy\utils\misc.py", line 40, in load_object
    raise ImportError, "Error loading object '%s': %s" % (path, e)
ImportError: Error loading object 'scrapy.core.scheduler.Scheduler': No module named queuelib

I guess something is wrong with the installation. Can anyone please help? Thanks in advance.
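The last line of the traceback reports ImportError: ... No module named queuelib, which points at the Scrapy installation rather than at the project code. A minimal diagnostic sketch (not from the original post; it only checks for the module named in that error) that can be run from the same Python interpreter:

try:
    # queuelib is the package the traceback says is missing; Scrapy 0.18's
    # scheduler tries to import it when the crawl starts.
    import queuelib
    print("queuelib found at " + queuelib.__file__)
except ImportError:
    # If this branch runs, the dependency is absent and "scrapy crawl" will keep
    # failing with the ImportError above; installing queuelib (e.g. via pip) is
    # the usual fix.
    print("queuelib is not installed")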

1 Answer:

Answer 0 (score: 2)

Can you check the name of the spider in the project you created: is it "demoz" or "dmoz"?

You specified "dmoz" as the spider name in the command:

d:\Scrapy workspace\tutorial>scrapy crawl dmoz
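For context, scrapy crawl <name> matches the name attribute defined inside a spider class under tutorial/spiders, not the project or directory name. A minimal sketch of a tutorial-style spider for Scrapy 0.18 (the file name and the parse body here are illustrative, following the official tutorial):

# tutorial/spiders/dmoz_spider.py  (illustrative path)
from scrapy.spider import BaseSpider


class DmozSpider(BaseSpider):
    # "scrapy crawl dmoz" looks up this value, so it must match exactly
    name = "dmoz"
    allowed_domains = ["dmoz.org"]
    start_urls = [
        "http://www.dmoz.org/Computers/Programming/Languages/Python/Books/",
        "http://www.dmoz.org/Computers/Programming/Languages/Python/Resources/",
    ]

    def parse(self, response):
        # Placeholder body: the tutorial simply saves the page to a local file.
        filename = response.url.split("/")[-2]
        open(filename, "wb").write(response.body)

If the name attribute in the spider is "demoz", the command has to be scrapy crawl demoz; otherwise Scrapy cannot find the spider (although in this particular traceback the crawl fails earlier, on the missing queuelib module).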