Debugging Celery with VS Code

Date: 2018-11-29 09:45:51

Tags: python visual-studio-code celery vscode-debugger

I'm using VS Code with Django for web development. Debugging Django works fine, but when I try to debug Celery, the debugger doesn't stop at breakpoints. I use this configuration to run Celery and Celery Beat:

{
    "name": "Beat",
    "type": "python",
    "request": "launch",
    "pythonPath": "/home/MyName/job/MyProject/venv/bin/python",
    "program": "/home/MyName/job/MyProject/venv/bin/celery",
    "console": "integratedTerminal",
    "args": [
        "-A",
        "bgp",
        "beat",
        "-l",
        "info"
    ]
},
{
    "name": "Celery",
    "type": "python",
    "request": "launch",
    "pythonPath": "/home/MyName/job/MyProject/venv/bin/python",
    "program": "/home/MyName/job/MyProject/venv/bin/celery",
    "console": "integratedTerminal",
    "args": [
        "-A",
        "bgp",
        "worker",
        "-l",
        "info",
        "-Q",
        "ssh",
        "--concurrency=1",
    ]
},  

When I run Celery, I get this traceback:

[2018-11-29 13:18:34,112: CRITICAL/MainProcess] Unrecoverable error: RuntimeError('already started',)
Traceback (most recent call last):
  File "/home/MyName/job/MyProject/venv/lib/python2.7/site-packages/celery/worker/worker.py", line 205, in start
    self.blueprint.start(self)
  File "/home/MyName/job/MyProject/venv/lib/python2.7/site-packages/celery/bootsteps.py", line 119, in start
    step.start(parent)
  File "/home/MyName/job/MyProject/venv/lib/python2.7/site-packages/celery/bootsteps.py", line 369, in start
    return self.obj.start()
  File "/home/MyName/job/MyProject/venv/lib/python2.7/site-packages/celery/concurrency/base.py", line 131, in start
    self.on_start()
  File "/home/MyName/job/MyProject/venv/lib/python2.7/site-packages/celery/concurrency/prefork.py", line 112, in on_start
    **self.options)
  File "/home/MyName/job/MyProject/venv/lib/python2.7/site-packages/celery/concurrency/asynpool.py", line 432, in __init__
    super(AsynPool, self).__init__(processes, *args, **kwargs)
  File "/home/MyName/job/MyProject/venv/lib/python2.7/site-packages/billiard/pool.py", line 1007, in __init__
    self._create_worker_process(i)
  File "/home/MyName/job/MyProject/venv/lib/python2.7/site-packages/celery/concurrency/asynpool.py", line 449, in _create_worker_process
    return super(AsynPool, self)._create_worker_process(i)
  File "/home/MyName/job/MyProject/venv/lib/python2.7/site-packages/billiard/pool.py", line 1116, in _create_worker_process
    w.start()
  File "/home/MyName/job/MyProject/venv/lib/python2.7/site-packages/billiard/process.py", line 124, in start
    self._popen = self._Popen(self)
  File "/home/MyName/job/MyProject/venv/lib/python2.7/site-packages/billiard/context.py", line 333, in _Popen
    return Popen(process_obj)
  File "/home/MyName/job/MyProject/venv/lib/python2.7/site-packages/billiard/popen_fork.py", line 24, in __init__
    self._launch(process_obj)
  File "/home/MyName/job/MyProject/venv/lib/python2.7/site-packages/billiard/popen_fork.py", line 72, in _launch
    self.pid = os.fork()
  File "/home/MyName/job/MyProject/venv/lib/python2.7/site-packages/ptvsd/_vendored/pydevd/_pydev_bundle/pydev_monkey.py", line 488, in new_fork
    _on_forked_process()
  File "/home/MyName/job/MyProject/venv/lib/python2.7/site-packages/ptvsd/_vendored/pydevd/_pydev_bundle/pydev_monkey.py", line 56, in _on_forked_process
    pydevd.settrace_forked()
  File "/home/MyName/job/MyProject/venv/lib/python2.7/site-packages/ptvsd/_vendored/pydevd/pydevd.py", line 1723, in settrace_forked
    patch_multiprocessing=True,
  File "/home/MyName/job/MyProject/venv/lib/python2.7/site-packages/ptvsd/_vendored/pydevd/pydevd.py", line 1488, in settrace
    stop_at_frame,
  File "/home/MyName/job/MyProject/venv/lib/python2.7/site-packages/ptvsd/_vendored/pydevd/pydevd.py", line 1536, in _locked_settrace
    debugger.connect(host, port)  # Note: connect can raise error.
  File "/home/MyName/job/MyProject/venv/lib/python2.7/site-packages/ptvsd/_vendored/pydevd/pydevd.py", line 484, in connect
    s = start_client(host, port)
  File "/home/MyName/job/MyProject/venv/lib/python2.7/site-packages/ptvsd/pydevd_hooks.py", line 125, in <lambda>
    _start_client = (lambda h, p: start_client(daemon, h, p))
  File "/home/MyName/job/MyProject/venv/lib/python2.7/site-packages/ptvsd/pydevd_hooks.py", line 71, in start_client
    sock, start_session = daemon.start_client((host, port))
  File "/home/MyName/job/MyProject/venv/lib/python2.7/site-packages/ptvsd/daemon.py", line 208, in start_client
    with self.started():
  File "/usr/local/lib/python2.7/contextlib.py", line 17, in __enter__
    return self.gen.next()
  File "/home/MyName/job/MyProject/venv/lib/python2.7/site-packages/ptvsd/daemon.py", line 109, in started
    self.start()
  File "/home/MyName/job/MyProject/venv/lib/python2.7/site-packages/ptvsd/daemon.py", line 144, in start
    raise RuntimeError('already started')
RuntimeError: already started
[2018-11-29 13:18:34,158: INFO/MainProcess] Connected to amqp://project:**@127.0.0.1:5672/project
[2018-11-29 13:18:34,210: INFO/MainProcess] mingle: searching for neighbors
[2018-11-29 13:18:35,292: INFO/MainProcess] mingle: all alone
[2018-11-29 13:18:35,353: WARNING/MainProcess] /home/MyName/job/MyProject/venv/lib/python2.7/site-packages/celery/fixups/django.py:200: UserWarning: Using settings.DEBUG leads to a memory leak, never use this setting in production environments!
  warnings.warn('Using settings.DEBUG leads to a memory leak, never '
[2018-11-29 13:18:35,354: INFO/MainProcess] celery@MyName-vm ready..  

Celery itself works fine. But when I set a breakpoint in any task, the Celery worker never stops on it. How can I fix this?
My Celery version: celery[redis]==4.2.0

3 Answers:

Answer 0 (score: 3):

You can try adding "-P solo" to the args of your Celery configuration; the solo pool runs tasks in the worker's main process, which avoids the os.fork() call that triggers the "already started" error in the traceback above. See https://github.com/Microsoft/ptvsd/issues/1046

Here is my Celery configuration; it works correctly for me.

{
    "name": "Python: Celery",
    "type": "python",
    "request": "launch",
    "module": "celery",
    "console": "integratedTerminal",
    "args": [
        "-A",
        "tsbc",
        "worker",
        "-l",
        "info",
        "-P",
        "solo"
    ]
}
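
Applied to the worker entry from the question, it might look roughly like this (just a sketch: the -A bgp app, the ssh queue, and the interpreter path are taken from the question; --concurrency is dropped because the solo pool runs a single in-process worker anyway):

{
    "name": "Celery",
    "type": "python",
    "request": "launch",
    "pythonPath": "/home/MyName/job/MyProject/venv/bin/python",
    "module": "celery",
    "console": "integratedTerminal",
    "args": [
        "-A",
        "bgp",
        "worker",
        "-l",
        "info",
        "-Q",
        "ssh",
        "-P",
        "solo"
    ]
}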

Answer 1 (score: 0):

Here is my configuration, which works fine.

{
  "name": "Python: Django Shell",
  "type": "python",
  "request": "launch",
  "program": "${workspaceFolder}/manage.py",
  "args": [
    "shell"
  ],
  "django": true
},
{
  "name": "Python: Celery Workers",
  "type": "python",
  "request": "launch",
  "module": "celery",
  "console": "integratedTerminal",
  "envFile": "${workspaceFolder}/.env",
  "args": ["-A", "yourproject", "worker", "-l", "debug", "-Q", "queueName"]

}

Answer 2 (score: 0):

I came up with a way to use debugpy in VS Code that gives a better debugging experience than Celery's pdb. I use docker-compose, but even if you don't, you can still apply the same idea.

So, first start Celery with debugpy:

  celery:
    command:
      [
        "sh",
        "-c",
        "pip install debugpy -t /tmp && python /tmp/debugpy --listen 0.0.0.0:6900 -m celery -A backend.celery worker -l info",
      ]
    ports:
      - 6900:6900
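
If you are not using docker-compose, roughly the same thing works by installing debugpy into your environment and launching the worker through it (a sketch based on the compose command above: the backend.celery app name and port 6900 are carried over from it, and --wait-for-client is an optional debugpy flag that makes the worker wait until VS Code attaches before it starts):

pip install debugpy
python -m debugpy --listen 0.0.0.0:6900 --wait-for-client -m celery -A backend.celery worker -l info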

Then, here is the launch.json:

{
    "name": "Celery: Remote Attach",
    "type": "python",
    "request": "attach",
    "connect": {
        "host": "localhost",
        "port": 6900
    },
    "pathMappings": [{
        "localRoot": "${workspaceFolder}",
        "remoteRoot": "/app"
    }],
    // "preLaunchTask": "docker-compose up",
    "django": true,
},

That's it, enjoy debugging! You can set breakpoints and evaluate variables as usual.

I have a post with more details about this.
