Celery task status is always PENDING when tasks live in a separate file

Asked: 2015-09-03 03:42:45

Tags: python flask rabbitmq celery

I am trying to refactor my tasks into their own file. However, this causes the job status to never be updated - it stays PENDING, even though the task itself runs fine.

Here is my app.py:

from flask import Flask, jsonify
from celery.task.control import inspect

from jobs import task_one
from factory import create_app, create_celery

app = create_app()
celery = create_celery(app)


@app.route('/run', methods=['GET'])
def run_task():
    # run job in celery
    task = task_one.run()
    return jsonify(name=app.name, status='Task is running', taskid=task.id), 202


@app.route('/status/<taskid>', methods=['GET'])
def task_status(taskid):
    task = celery.AsyncResult(taskid)
    return jsonify(status=task.state)


def main():
    app.run()


if __name__ == '__main__':
    main()

Here is my factory.py:

from flask import Flask
from celery import Celery


def create_app():
    app = Flask(__name__)
    app.config['DEBUG'] = True
    app.config['CELERY_BROKER_URL'] = 'amqp://127.0.0.1'
    app.config['CELERY_RESULT_BACKEND'] = 'rpc'
    app.config['CELERY_TRACK_STARTED'] = True

    return app


def create_celery(app=None):
    app = app or create_app()
    celery = Celery(app.import_name, broker=app.config['CELERY_BROKER_URL'])
    celery.conf.update(app.config)
    TaskBase = celery.Task
    class ContextTask(TaskBase):
        abstract = True
        def __call__(self, *args, **kwargs):
            # run every task inside the Flask application context
            with app.app_context():
                return TaskBase.__call__(self, *args, **kwargs)
    celery.Task = ContextTask
    return celery

Here is my jobs/task_one.py:

from time import sleep
from celery import chain
from factory import create_celery

celery = create_celery()


@celery.task(name='jobs.long_running_task', bind=True)
def long_running_task(self, x, y):
    sleep(15)
    print 'running:', x, y
    return x + y


@celery.task(name='jobs.long_mapping_task', bind=True)
def long_mapping_task(self, x, y):
    sleep(15)
    print 'mapping:', x, y
    return x + y


def run():
    task = chain(long_running_task.s(1,2), long_mapping_task.s(4))()
    return task

So I start rabbitmq, run the Flask app via python app.py, and run the Celery worker via celery worker -A app.celery --loglevel=debug --concurrency=1.

The task runs fine, but the job status is always PENDING.

Now, if I put everything in a single file, it works. The following code works as expected:

from time import sleep

from flask import Flask, jsonify
from celery import Celery, chain
from celery.task.control import inspect


app = Flask(__name__)
app.config['DEBUG'] = True
app.config['CELERY_BROKER_URL'] = 'amqp://127.0.0.1'
app.config['CELERY_RESULT_BACKEND'] = 'rpc'
app.config['CELERY_TRACK_STARTED'] = True

celery = Celery(app.name, broker=app.config['CELERY_BROKER_URL'])
celery.conf.update(app.config)


@celery.task(bind=True)
def long_running_task(self, x, y):
    sleep(15)
    print 'running:', x, y
    return x + y


@celery.task(bind=True)
def long_mapping_task(self, x, y):
    sleep(15)
    print 'mapping:', x, y
    return x + y


@app.route('/run', methods=['GET'])
def run_task():
    # run job in celery
    task = chain(long_running_task.s(1,2), long_mapping_task.s(4))()
    return jsonify(name=app.name, status='Job is running', taskid=task.id), 202

@app.route('/status/<taskid>', methods=['GET'])
def task_status(taskid):
    task = celery.AsyncResult(taskid)

    return jsonify(status=task.state)


def main():
    app.run()


if __name__ == '__main__':
    main()

I don't understand why this happens or how to fix it. I have seen other solutions posted here on SO, but none of them apply to my case. Any help is appreciated.

1 Answer:

Answer 0 (score: 1)

It looks like you indeed have two Celery instances:

in app.py: celery = create_celery(app)

in jobs/task_one.py: celery = create_celery()

You should share the single Celery instance created in app.py by importing it in jobs/task_one.py:

from app import celery

Note that you may need to move the statement from jobs import task_one to avoid a circular dependency between the app and task_one modules.
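
For concreteness, a minimal sketch of how the two modules could look after that change; the exact placement of the delayed import is an assumption, not something the answer spells out:

# jobs/task_one.py - reuse the one Celery instance created in app.py
from time import sleep
from celery import chain

from app import celery  # shared instance instead of create_celery()


@celery.task(name='jobs.long_running_task', bind=True)
def long_running_task(self, x, y):
    sleep(15)
    print 'running:', x, y
    return x + y


@celery.task(name='jobs.long_mapping_task', bind=True)
def long_mapping_task(self, x, y):
    sleep(15)
    print 'mapping:', x, y
    return x + y


def run():
    return chain(long_running_task.s(1, 2), long_mapping_task.s(4))()

# app.py - only the import order changes
from flask import Flask, jsonify

from factory import create_app, create_celery

app = create_app()
celery = create_celery(app)

# imported after celery exists, so jobs/task_one.py can do `from app import celery`
from jobs import task_one

With a single shared instance, tasks are registered and their state queried through the same Celery app, which is what the answer suggests resolves the perpetual PENDING state.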