celery - message queues for long-running processes

Date: 2018-01-21 14:01:03

Tags: python django rabbitmq celery python-multiprocessing

I'm building a web server with django 1.11.5, using celery-3.1.23 & rabbitmq as the message-queue manager to send asynchronous tasks to a number of different daemon processes (processes with infinite loops [long running]).

How can I dynamically create a separate queue for each process, receive messages from the process's queue inside the daemon, perform some action asynchronously, and then forward the result to another "aggregator queue" that collects & validates the results and sends a response back to the user? (See the attached illustration.)

So far, I connect the processes through multiprocessing.connection Client and Listener objects, and start the processes with Process objects.

Code - consumer:

from multiprocessing.connection import Listener
from multiprocessing import Process

def main_process_loop(path, port_id, auth_key):

    # Initialize the ActionHandler instance that handles the work stream:
    action_handler = ActionHandler(path)

    # Infinite loop that runs the process:
    pid, auth_key_bytes = int(port_id), bytes(auth_key)
    address = ('localhost', pid)  # family is deduced to be 'AF_INET'

    while True:

        try:
            listener = Listener(address, authkey=auth_key_bytes)
            conn = listener.accept()
            input_values = conn.recv()
            listener.close()

            if input_values is None:
                raise Exception(ERR_MSG_INVALID_ARGV)

            else:
                # do something with input_values and action_handler
                pass

            # need to return a success message to the user

        except Exception as err:
            # need to return a failure message to the user
            pass

if __name__ == '__main__':
    # worker_processes = []
    for auth_key, port_id in PID_DICT.items():
        path = TEMPLATE_FORMAT.format(auth_key)
        p = Process(target=main_process_loop, args=(path, port_id, auth_key))
        # worker_processes.append(p)
        p.start()
    # for p in worker_processes:
    #     p.join()
    # print "all processes have been initiated"
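The `# need to return a success message` gaps above hint at a reply step that the consumer never performs. A minimal, self-contained round trip with `Listener`/`Client` (the authkey and the argv payload below are arbitrary placeholders, and the OS picks a free port) would look like:

```python
import threading
from multiprocessing.connection import Listener, Client

AUTH_KEY = b'secret'                                 # placeholder authkey

# Bind first (port 0 = OS-assigned) so the client can never connect too early.
listener = Listener(('localhost', 0), authkey=AUTH_KEY)

def serve_once():
    """Accept one connection, read the argv string, reply with an ack."""
    with listener.accept() as conn:
        argv_string = conn.recv()
        # ... hand argv_string to the ActionHandler here ...
        conn.send('ok: ' + argv_string)              # success message back to the caller

server = threading.Thread(target=serve_once)
server.start()

with Client(listener.address, authkey=AUTH_KEY) as conn:
    conn.send('IN_TYPE=csv IN_PATH=/tmp/x OUT_PATH=/tmp')
    reply = conn.recv()

server.join()
listener.close()
print(reply)
```

Because `conn` is bidirectional, the daemon can `send()` an acknowledgement on the same connection it `recv()`ed the job from, instead of closing the listener immediately as the consumer above does.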

Code - celery task:

import os

from multiprocessing.connection import Client
from celery import Celery

app = Celery('tasks', broker='amqp://localhost:5672//')

@app.task
def run_backend_processes(a_lst, b_lst, in_type, out_path, in_file_name):

    ARGV_FORMAT = r"IN_TYPE={0} IN_PATH={1} B_LEVEL=" + str(b_lst) + " OUT_PATH={2}"

    for process in a_lst:

        # map each daemon name to its listening port
        pid = {
            'A': 6001,
            'B': 6002,
            'C': 6003,
            'D': 6004,
        }[process]

        file_path = os.path.join(out_path, process + "_" + in_file_name)
        argv_string = ARGV_FORMAT.format(in_type, file_path, out_path)

        address = ('localhost', int(pid))
        conn = Client(address, authkey=bytes(process))  # NOTE: was `mxd_process`, an undefined name
        conn.send(str(argv_string))
        conn.close()
    return 'process succeed'
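The argv-string construction in the task can be checked in isolation; note that `B_LEVEL` is baked in when the format string is built, while the other fields are filled per process (the values below are made-up examples):

```python
import os

b_lst = [1, 2, 3]
# B_LEVEL is interpolated now; {0}/{1}/{2} are filled later by .format()
ARGV_FORMAT = r"IN_TYPE={0} IN_PATH={1} B_LEVEL=" + str(b_lst) + " OUT_PATH={2}"

out_path, in_file_name, process = '/tmp/out', 'data.csv', 'A'
file_path = os.path.join(out_path, process + "_" + in_file_name)
argv_string = ARGV_FORMAT.format('csv', file_path, out_path)
print(argv_string)
```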

And the django view is nothing unusual - it uses "run_backend_processes.delay".

Thank you, Yoav.

Q&A threads I've already tried:

  1. Celery parallel distributed task with multiprocessing

  2. Can a celery worker/server accept tasks from a non celery producer?

0 Answers:

No answers yet.