Celery and RabbitMQ: how to get an external worker to consume tasks from a queue

Date: 2018-04-08 10:31:32

Tags: django rabbitmq celery

I have 2 servers, A (producer) & B (consumer).

Server A has a celery_beat task that runs every 30 seconds.

Server A is running RabbitMQ 3.5.7 and Celery 4.1.0.

Server B is running Celery 4.1.0.

Server A can send jobs to the tasks queue, but Server B's celery_worker never picks them up. It raises the warning below and discards all of the jobs.

When the tasks are run from a shell on either server they work fine, so I think the problem is in my Celery worker configuration.
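For reference, a minimal sketch of what running the task by hand from a Django shell looks like, assuming the task path from the CELERY_ROUTES entry shown further down:

# Sketch only: trigger the producer task and route it explicitly to the
# 'tasks' queue, mirroring the CELERY_ROUTES entry in Server A's settings.
from task.tasks import schedule_task

schedule_task.apply_async(queue='tasks', routing_key='tasks')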

This is the error I get in Server B's celery_worker.log:
[2018-04-08 10:07:01,083: WARNING/MainProcess] Received and deleted unknown message.  Wrong destination?!?

The full contents of the message body was: body: '{"user": "Lewis", "status": "submitted", "start": "21-10-2017T21:08:04Z+00:00", "end": "21-10-2017T21:08:04Z+00:00", "profile_data": {"delivery_method": "ftp", "transcode_settings": "test", "delivery_settings": {"test": "test"}, "name": "Amazon", "contact_details": "test"}, "id": 78, "video_data": {"segment_data": {"segment_data": {"segment_4": {"end": "00:00:05:00", "start": "00:00:00:00"}, "segment_3": {"end": "00:00:05:00", "start": "00:15:00:00"}, "segment_1": {"end": "00:00:05:00", "start": "00:10:00:00"}, "segment_2": {"end": "00:00:05:00", "start": "00:05:00:00"}}}, "material_id": "LB000002", "total_duration": "00:00:20:00", "audio_tracks": {"en": [1, 2]}, "resolution": "SD169", "number_of_segments": 4}}' (720b)
{content_type:None content_encoding:None
  delivery_info:{'redelivered': False, 'routing_key': 'tasks', 'consumer_tag': 'None8', 'exchange': '', 'delivery_tag': 46} headers={}}
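As far as I can tell, the content_type:None, content_encoding:None and empty headers={} in that dump mean the worker received a bare JSON body rather than a message in Celery's task protocol (which carries the task name in the message headers), so it has nothing to match the payload against. For comparison, a sketch of pushing the same kind of payload through Celery itself; the app module path is an assumption based on the -A flag in the systemd units below, and the task name is the one from Server B's settings:

# Sketch only: dispatch the payload as a proper Celery task message onto the
# 'tasks' queue instead of a raw AMQP body.
from mediahub_core.celery import app

payload = {"user": "Lewis", "status": "submitted", "id": 78}  # trimmed example

app.send_task('transcode.tasks.process_task', args=[payload], queue='tasks')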

I have been through the Celery and RabbitMQ documentation and I'm stumped; any help would be greatly appreciated.

Below are the settings.py files and the systemd unit files.

Server A settings.py

from kombu import Queue, Exchange

CELERY_ACCEPT_CONTENT = ['application/json']
CELERY_RESULT_SERIALIZER = 'json'
CELERY_TASK_SERIALIZER = 'json'
CELERY_TIMEZONE = 'Europe/London'
CELERY_QUEUES = {
    Queue('tasks', Exchange('tasks', routing_key='tasks'))
}
CELERY_ROUTES = {
    'task.tasks.schedule_task': {
        'queue': 'tasks',
        'routing_key': 'tasks',
    },
}
CELERY_BEAT_SCHEDULE = {
    'get_node_status': {
        'task': 'node_management.tasks.get_node_status',
        'schedule': 15,
    },
    'schedule_task': {
        'task': 'task.tasks.schedule_task',
        'schedule': 30,
    }
}
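The celery.py app module for mediahub_core isn't shown; here is a sketch of the standard Django layout these settings assume (the module path follows the -A mediahub_core flag in the unit files below):

# mediahub_core/celery.py -- sketch of the standard Django/Celery app module;
# the real file isn't included in the question.
import os
from celery import Celery

os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'mediahub_core.settings')

app = Celery('mediahub_core')
app.config_from_object('django.conf:settings', namespace='CELERY')
app.autodiscover_tasks()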

Server A task_scheduler.service

[Unit]
Description=celery_beat
After=network.target

[Service]
Type=fork
User=ubuntu
Restart=always
WorkingDirectory=/vagrant
ExecStart=/home/ubuntu/venv/mediahub_core/bin/celery beat -A mediahub_core --loglevel=debug --logfile=/var/log/mediahub_core/celery_beat.log
StandardOutput=null
RestartSec=5
TimeoutStartSec=10
TimeoutStopSec=600
SendSIGKILL=yes

[Install]
WantedBy=multi-user.target

Server A celery_worker.service

[Unit]
Description=celery_worker
After=network.target

[Service]
Type=fork
User=ubuntu
Restart=always
WorkingDirectory=/vagrant
ExecStart=/home/ubuntu/venv/mediahub_core/bin/celery -A mediahub_core worker --loglevel=info --logfile=/var/log/mediahub_core/celery_worker.log
StandardOutput=null
RestartSec=5
TimeoutStartSec=10
TimeoutStopSec=600
SendSIGKILL=yes

[Install]
WantedBy=multi-user.target

Server B settings.py

from kombu import Queue, Exchange

CELERY_BROKER_URL = 'amqp://node:transcode@192.168.10.191'
CELERY_ACCEPT_CONTENT = ['application/json']
CELERY_RESULT_SERIALIZER = 'json'
CELERY_TASK_SERIALIZER = 'json'
CELERY_TIMEZONE = 'Europe/London'
CELERY_QUEUES = {
    Queue('tasks', Exchange('tasks', routing_key='tasks'))
}
CELERY_ROUTES = {
    'task.tasks.process_task': {
        'queue': 'tasks',
        'routing_key': 'tasks',
    },
}
CELERY_BEAT_SCHEDULE = {
    'schedule_task': {
        'task': 'transcode.tasks.process_task',
        'schedule': 10,
    },
}
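One way to double-check what the worker on Server B is actually consuming is from a Django shell on that box; the mediahub_node.celery path is an assumption based on the -A flag in the unit file below:

# Sketch: confirm the queues the running worker is bound to, and the queues
# this app's own configuration knows about.
from mediahub_node.celery import app

print(app.control.inspect().active_queues())
print(list(app.amqp.queues.keys()))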

Server B celery_worker.service

[Unit]
Description=celery_worker
After=network.target

[Service]
Type=fork
User=ubuntu
Restart=always
WorkingDirectory=/vagrant
ExecStart=/home/ubuntu/venv/mediahub_node/bin/celery -A mediahub_node worker -Q tasks --loglevel=debug --logfile=/var/log/mediahub_node/celery_worker.log
RestartSec=5
TimeoutStartSec=10
TimeoutStopSec=600
SendSIGKILL=yes

[Install]
WantedBy=multi-user.target

0 Answers