How can I force a logger format during Celery task execution?

Asked: 2019-01-22 11:57:44

Tags: python django logging celery

I have some services that log debug messages using the Python logging module.

my_service.py:

import logging

logger = logging.getLogger(__name__)

class SomeService:
    def synchronize(self):
        logger.debug('synchronizing stuff')
        external_library.call('do it')  # third-party dependency that does its own logging
        logger.debug('found x results')

I then use this service from a Celery task:

tasks.py:

from celery import shared_task

@shared_task
def synchronize_stuff():
    stuff = some_service.synchronize()

The worker then outputs logs like these:

worker_1     | [2019-01-22 11:39:19,232: DEBUG/MainProcess] Task accepted: my_task_name[48d706d7-0d92-43aa-aa9d-d5db8d660af8] pid:12
worker_1     | [2019-01-22 11:39:19,237: DEBUG/ForkPoolWorker-1] Starting new HTTPS connection (1): example.com:443
worker_1     | [2019-01-22 11:39:19,839: DEBUG/ForkPoolWorker-1] https://example.com:443 "GET /api/stuff HTTP/1.1" 200 None
worker_1     | [2019-01-22 11:39:19,860: DEBUG/ForkPoolWorker-1] Processing 35
worker_1     | [2019-01-22 11:39:19,862: DEBUG/ForkPoolWorker-1] Item 35 already closed, ignoring.
worker_1     | [2019-01-22 11:39:19,863: DEBUG/ForkPoolWorker-1] Processing 36
worker_1     | [2019-01-22 11:39:19,865: DEBUG/ForkPoolWorker-1] Item 36 already closed, ignoring.
worker_1     | [2019-01-22 11:39:19,865: DEBUG/ForkPoolWorker-1] Processing 49
worker_1     | [2019-01-22 11:39:20,380: DEBUG/ForkPoolWorker-1] https://example.com:443 "GET /api/detail/49 HTTP/1.1" 200 None
worker_1     | [2019-01-22 11:39:20,429: DEBUG/ForkPoolWorker-1] Processing 50
worker_1     | [2019-01-22 11:39:20,680: DEBUG/ForkPoolWorker-1] https://example.com:443 "GET /api/detail/50 HTTP/1.1" 200 None
worker_1     | [2019-01-22 11:39:20,693: DEBUG/ForkPoolWorker-1] Processing 51
worker_1     | [2019-01-22 11:39:21,138: DEBUG/ForkPoolWorker-1] https://example.com:443 "GET /api/detail/51 HTTP/1.1" 200 None
worker_1     | [2019-01-22 11:39:21,197: INFO/ForkPoolWorker-1] Task my_task_name[48d706d7-0d92-43aa-aa9d-d5db8d660af8] succeeded in 1.9656380449960125s: None

This is good enough for debugging, but I would like to include the task name and UUID in these logs. That can be achieved by using the Celery task logger, like this:

my_service.py:

from celery.utils.log import get_task_logger
logger = get_task_logger(__name__)

class SomeService:
    def synchronize(self):
        logger.debug('synchronizing stuff')
        external_library.call('do it')
        logger.debug('found x results')

Which gives me exactly what I want in terms of logging:

worker_1     | [2019-01-22 11:39:19,232: DEBUG/MainProcess] Task accepted: my_task_name[48d706d7-0d92-43aa-aa9d-d5db8d660af8] pid:12
worker_1     | [2019-01-22 11:39:19,237: DEBUG/ForkPoolWorker-1] Starting new HTTPS connection (1): example.com:443
worker_1     | [2019-01-22 11:39:19,839: DEBUG/ForkPoolWorker-1] https://example.com:443 "GET /api/stuff HTTP/1.1" 200 None
worker_1     | [2019-01-22 11:39:19,860: DEBUG/ForkPoolWorker-1] my_task_name[48d706d7-0d92-43aa-aa9d-d5db8d660af8]: Processing 35
worker_1     | [2019-01-22 11:39:19,862: DEBUG/ForkPoolWorker-1] my_task_name[48d706d7-0d92-43aa-aa9d-d5db8d660af8]: Item 35 already closed, ignoring.
worker_1     | [2019-01-22 11:39:19,863: DEBUG/ForkPoolWorker-1] my_task_name[48d706d7-0d92-43aa-aa9d-d5db8d660af8]: Processing 36
worker_1     | [2019-01-22 11:39:19,865: DEBUG/ForkPoolWorker-1] my_task_name[48d706d7-0d92-43aa-aa9d-d5db8d660af8]: Item 36 already closed, ignoring.
worker_1     | [2019-01-22 11:39:19,865: DEBUG/ForkPoolWorker-1] my_task_name[48d706d7-0d92-43aa-aa9d-d5db8d660af8]: Processing 49
worker_1     | [2019-01-22 11:39:20,380: DEBUG/ForkPoolWorker-1] https://example.com:443 "GET /api/detail/49 HTTP/1.1" 200 None
worker_1     | [2019-01-22 11:39:20,429: DEBUG/ForkPoolWorker-1] my_task_name[48d706d7-0d92-43aa-aa9d-d5db8d660af8]: Processing 50
worker_1     | [2019-01-22 11:39:20,680: DEBUG/ForkPoolWorker-1] https://example.com:443 "GET /api/detail/50 HTTP/1.1" 200 None
worker_1     | [2019-01-22 11:39:20,693: DEBUG/ForkPoolWorker-1] my_task_name[48d706d7-0d92-43aa-aa9d-d5db8d660af8]: Processing 51
worker_1     | [2019-01-22 11:39:21,138: DEBUG/ForkPoolWorker-1] https://example.com:443 "GET /api/detail/51 HTTP/1.1" 200 None
worker_1     | [2019-01-22 11:39:21,197: INFO/ForkPoolWorker-1] Task my_task_name[48d706d7-0d92-43aa-aa9d-d5db8d660af8] succeeded in 1.9656380449960125s: None

But I have two problems with it:

  1. I don't want to use the Celery logger inside the service. The service can be used even in environments where Celery isn't installed at all (in which case it's fine that the task name and UUID don't appear in the logs).

  2. Logs from external libraries executed during the same task don't use the same logger, so they don't include the task name and UUID.

Which brings me to my question: is it possible to specify (force) a logger at the task level (in tasks.py), regardless of how I log in my services or how external libraries log? Something like this would do:

tasks.py:

@shared_task
def synchronize_stuff():
    logging.enforce_logger(get_task_logger(__name__))
    stuff = some_service.synchronize()
    logging.restore_logger()
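
There is no enforce_logger in the standard logging module, but for illustration, here is a rough sketch of the idea: a context manager that temporarily attaches a logging.Filter to the root logger's handlers, so that every record emitted inside the block (including records from external libraries) gets prefixed with the current task's name and id. TaskPrefixFilter and enforce_task_logging are made-up names, and get_current_task is the internal helper Celery's own TaskFormatter relies on, so treat this as an assumption-laden sketch rather than a vetted solution:

import logging
from contextlib import contextmanager

from celery import shared_task
from celery._state import get_current_task  # internal helper used by Celery's TaskFormatter


class TaskPrefixFilter(logging.Filter):
    """Prepend 'task_name[task_id]: ' to every record passing through a handler."""

    def filter(self, record):
        task = get_current_task()
        if task is not None and task.request.id is not None \
                and not getattr(record, '_task_prefixed', False):
            record.msg = '%s[%s]: %s' % (task.name, task.request.id, record.msg)
            record._task_prefixed = True  # guard against double-prefixing
        return True


@contextmanager
def enforce_task_logging():
    """Attach the filter to every root handler for the duration of the block."""
    task_filter = TaskPrefixFilter()
    handlers = logging.getLogger().handlers
    for handler in handlers:
        handler.addFilter(task_filter)
    try:
        yield
    finally:
        for handler in handlers:
            handler.removeFilter(task_filter)


@shared_task
def synchronize_stuff():
    with enforce_task_logging():
        stuff = some_service.synchronize()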

It's also worth noting that I'm using Django in this project.

Thanks!

1 Answer:

Answer 0 (score: 0):

This is not exactly what you're looking for. But I ran into a similar problem and solved it with a logging filter, applied to the handler that logs to a service which shouldn't receive the Celery log messages. I describe my problem and solution in this question: How can I log from my python application to splunk, if I use celery as my task scheduler?
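
A minimal sketch of that filter approach, assuming the goal is to keep Celery's own records away from one specific handler (SkipCeleryRecords and external_handler are placeholder names):

import logging


class SkipCeleryRecords(logging.Filter):
    """Drop records that originate from Celery's own loggers."""

    def filter(self, record):
        # Celery's internal loggers are named 'celery' or 'celery.*';
        # returning False keeps their records out of this handler.
        return not (record.name == 'celery' or record.name.startswith('celery.'))


# Attach the filter only to the handler that ships logs to the external
# service; 'external_handler' is a placeholder (e.g. a Splunk HTTP handler).
external_handler = logging.StreamHandler()
external_handler.addFilter(SkipCeleryRecords())
logging.getLogger().addHandler(external_handler)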

Let me know if this points you in the right direction...

Also, I've gotten very good results using Python's logging.dictConfig!
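
For reference, a minimal dictConfig sketch (the formatter string and handler names are examples only; in a Django project this dict would normally live in settings.py as the LOGGING setting):

import logging.config

LOGGING = {
    'version': 1,
    'disable_existing_loggers': False,
    'formatters': {
        'verbose': {
            'format': '[%(asctime)s: %(levelname)s/%(processName)s] %(name)s: %(message)s',
        },
    },
    'handlers': {
        'console': {
            'class': 'logging.StreamHandler',
            'formatter': 'verbose',
        },
    },
    'root': {
        'handlers': ['console'],
        'level': 'DEBUG',
    },
}

logging.config.dictConfig(LOGGING)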