Celery not discovering tasks

Date: 2017-04-03 14:20:35

Tags: python django celery celery-task celerybeat

I use Celery to run tasks in my application. I set it up without any trouble in my development environment, where it works perfectly with Redis as the broker. Yesterday I moved the code to my server and set up Redis, but Celery cannot discover the tasks. The code is identical.

My celery_conf.py file (originally celery.py):

# coding: utf-8
from __future__ import absolute_import, unicode_literals

import os
from celery import Celery
from django.conf import settings


# set the default Django settings module for the 'celery' program.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'vertNews.settings')
app = Celery('vertNews')

app.config_from_object('django.conf:settings', namespace='CELERY')
app.autodiscover_tasks(lambda: settings.INSTALLED_APPS)


@app.task(bind=True)
def debug_task(self):
    print('Request: {0!r}'.format(self.request))

The Celery configuration in my settings:

# Celery Configuration

CELERY_TASK_ALWAYS_EAGER = False
CELERY_BROKER_URL = SECRETS['celery']['broker_url']
CELERY_RESULT_BACKEND = SECRETS['celery']['result_backend']
CELERY_ACCEPT_CONTENT = ['application/json']
CELERY_TASK_SERIALIZER = 'json'
CELERY_RESULT_SERIALIZER = 'json'
CELERY_TIMEZONE = TIME_ZONE

The __init__.py of the root app:

# coding: utf-8
from __future__ import absolute_import, unicode_literals

from .celery_conf import app as celery_app

__all__ = ['celery_app']

My tasks:

# coding=utf-8
from __future__ import unicode_literals, absolute_import

import logging
from celery.schedules import crontab
from celery.task import periodic_task
from .api import fetch_tweets, delete_tweets


logger = logging.getLogger(__name__)


@periodic_task(
    run_every=(crontab(minute=10, hour='0, 6, 12, 18, 23')),
    name="fetch_tweets_task",
    ignore_result=True)
def fetch_tweets_task():
    logger.info("Tweet download started")
    fetch_tweets()
    logger.info("Tweet download and summarization finished")


@periodic_task(
    run_every=(crontab(minute=13, hour=13)),
    name="delete_tweets_task",
    ignore_result=True)
def delete_tweets_task():
    logger.info("Tweet deletion started")
    delete_tweets()
    logger.info("Tweet deletion finished")
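As an aside, the `celery.task.periodic_task` decorator used above is deprecated in Celery 4; the same schedules can also be declared in settings through the beat schedule. A minimal sketch, assuming the task names registered above and the `CELERY_` settings namespace from celery_conf.py:

```python
# Sketch: the same schedules expressed as a beat schedule in Django settings,
# instead of the periodic_task decorator. Names match the tasks above.
from celery.schedules import crontab

CELERY_BEAT_SCHEDULE = {
    'fetch_tweets_task': {
        'task': 'fetch_tweets_task',
        'schedule': crontab(minute=10, hour='0,6,12,18,23'),
    },
    'delete_tweets_task': {
        'task': 'delete_tweets_task',
        'schedule': crontab(minute=13, hour=13),
    },
}
```

With `app.config_from_object('django.conf:settings', namespace='CELERY')`, this setting is picked up as `beat_schedule` on the app.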

The result when I run it on the remote server (not working):

(trendiz) kenneth@bots:~/projects/verticals-news/src$ celery -A vertNews beat -l debug
Trying import production.py settings...
celery beat v4.0.2 (latentcall) is starting.
__    -    ... __   -        _
LocalTime -> 2017-04-03 13:55:49
Configuration ->
    . broker -> redis://localhost:6379//
    . loader -> celery.loaders.app.AppLoader
    . scheduler -> celery.beat.PersistentScheduler
    . db -> celerybeat-schedule
    . logfile -> [stderr]@%DEBUG
    . maxinterval -> 5.00 minutes (300s)
[2017-04-03 13:55:49,770: DEBUG/MainProcess] Setting default socket timeout to 30
[2017-04-03 13:55:49,771: INFO/MainProcess] beat: Starting...
[2017-04-03 13:55:49,785: DEBUG/MainProcess] Current schedule:

[2017-04-03 13:55:49,785: DEBUG/MainProcess] beat: Ticking with max interval->5.00 minutes
[2017-04-03 13:55:49,785: DEBUG/MainProcess] beat: Waking up in 5.00 minutes.
The result when I run it on the dev server (working):

LocalTime -> 2017-04-03 14:16:19
Configuration ->
    . broker -> redis://localhost:6379//
    . loader -> celery.loaders.app.AppLoader
    . scheduler -> celery.beat.PersistentScheduler
    . db -> celerybeat-schedule
    . logfile -> [stderr]@%DEBUG
    . maxinterval -> 5.00 minutes (300s)
[2017-04-03 14:16:19,919: DEBUG/MainProcess] Setting default socket timeout to 30
[2017-04-03 14:16:19,919: INFO/MainProcess] beat: Starting...
[2017-04-03 14:16:19,952: DEBUG/MainProcess] Current schedule:
<ScheduleEntry: fetch_tweets_task fetch_tweets_task() <crontab: 36 0, 6, 12, 18, 22 * * * (m/h/d/dM/MY)>
<ScheduleEntry: delete_tweets_task delete_tweets_task() <crontab: 13 13 * * * (m/h/d/dM/MY)>
[2017-04-03 14:16:19,952: DEBUG/MainProcess] beat: Ticking with max interval->5.00 minutes
[2017-04-03 14:16:19,953: DEBUG/MainProcess] beat: Waking up in 5.00 minutes.

I am running Python 3.5 and Celery 4.0.2 in both environments.

1 Answer:

Answer 0 (score: 0):

I don't know exactly what the problem was, but clearing all the *.pyc files in the project got rid of it. Stale bytecode compiled from an older version of the code can apparently shadow the updated modules, so the tasks were never registered.
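For anyone who wants to script the cleanup, a small sketch that removes compiled bytecode (loose *.pyc files and, on Python 3, __pycache__ directories) under a project tree; `clear_bytecode` is a helper name introduced here, not part of Celery:

```python
import shutil
from pathlib import Path

def clear_bytecode(root):
    """Delete *.pyc files and __pycache__ directories under root."""
    root = Path(root)
    # Remove loose .pyc files (Python 2 style, or stray copies).
    for pyc in root.rglob('*.pyc'):
        pyc.unlink()
    # Remove Python 3 bytecode cache directories wholesale.
    for cache in root.rglob('__pycache__'):
        shutil.rmtree(cache, ignore_errors=True)

clear_bytecode('.')  # run from the project directory
```

This is equivalent to running `find . -name "*.pyc" -delete` from the project root.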