How to set max_proc_per_cpu in Scrapyd

Asked: 2017-12-09 20:17:17

Tags: python scrapy scrapyd

I have two Scrapy projects with the following configurations.

scrapy.cfg for Project1:

[settings]
default = Project1.settings
[deploy]
url = http://localhost:6800/
project = Project1

[scrapyd]
eggs_dir    = eggs
logs_dir    = logs
logs_to_keep = 500
dbs_dir     = dbs
max_proc    = 5
max_proc_per_cpu = 10
http_port   = 6800
debug       = off
runner      = scrapyd.runner
application = scrapyd.app.application

and scrapy.cfg for Project2:

[settings]
default = Project2.settings
[deploy]
url = http://localhost:6800/
project = Project2

[scrapyd]
eggs_dir    = eggs
logs_dir    = logs
logs_to_keep = 500
dbs_dir     = dbs
max_proc    = 5
max_proc_per_cpu = 10
http_port   = 6800
debug       = off
runner      = scrapyd.runner
application = scrapyd.app.application

But when I look at http://localhost:6800/jobs, I always see only 8 jobs running, which suggests my max_proc_per_cpu setting is not being applied and the default is still in effect. I delete the projects with the following commands:

curl http://localhost:6800/delproject.json -d project=Project1

curl http://localhost:6800/delproject.json -d project=Project2

and then deploy them again to make sure the new changes are picked up. But the number of running spiders is still 8.
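
For reference, this is roughly how I redeploy and then inspect the daemon (a sketch: scrapyd-deploy comes from the scrapyd-client package, and the daemonstatus.json endpoint exists in Scrapyd 1.2+):

# redeploy each project from its own directory
cd Project1 && scrapyd-deploy
cd ../Project2 && scrapyd-deploy

# ask the daemon for its pending/running/finished job counts
curl http://localhost:6800/daemonstatus.json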

My VPS has two CPU cores, which I can confirm with python -c 'import multiprocessing; print(multiprocessing.cpu_count())'.
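
If I read the Scrapyd docs correctly, when max_proc is 0 (the default) the slot count is max_proc_per_cpu times the number of CPUs, and max_proc_per_cpu defaults to 4, so two cores would give exactly the 8 slots I am seeing:

# default max_proc_per_cpu (4) times my core count -> 8
python -c 'import multiprocessing; print(4 * multiprocessing.cpu_count())'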

How can I find out which configuration my Scrapyd deployment is actually using? And how do I set the maximum number of processes per CPU so that it takes effect?
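
My suspicion is that deploying only uploads the project egg, so the [scrapyd] section of each project's scrapy.cfg never reaches the server and the daemon keeps its defaults. As far as I can tell from the docs, Scrapyd reads its own configuration from /etc/scrapyd/scrapyd.conf, /etc/scrapyd/conf.d/*, a scrapyd.conf in the directory scrapyd is started from, or ~/.scrapyd.conf. A minimal standalone scrapyd.conf along those lines (untested sketch):

[scrapyd]
# 0 means: use max_proc_per_cpu * number of CPUs
max_proc = 0
# processes allowed per CPU when max_proc is 0
max_proc_per_cpu = 10
http_port = 6800

Is placing such a file in one of those locations and restarting the daemon the intended way to do this?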

0 Answers:

No answers yet.