Synchronization between Python processes

Posted: 2019-12-03 21:17:46

Tags: python synchronization locking python-multiprocessing

I have three processes running in parallel: two handleWorker processes and one aggregateSum process. The two handleWorker processes receive values in parallel from two connections, and the third process, aggregateSum, aggregates the values collected by the two handleWorker processes. How can I synchronize their work so that aggregateSum starts only after both handleWorker processes have finished collecting their values?
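For a single round, the hand-off I want is easy to express with a queue and an event; the sketch below is deliberately simplified (worker, aggregator and NUM_WORKERS are placeholder names, there are no sockets or gradients):

```python
from multiprocessing import Process, Queue, Event

NUM_WORKERS = 2  # placeholder for MAX_NUMBER_WORKERS

def worker(worker_id, results_q, done_evt):
    value = worker_id * 10        # stand-in for a value received over a connection
    results_q.put(value)          # hand the value to the aggregator
    done_evt.wait()               # block until the aggregator has finished
    print("worker", worker_id, "sees that aggregation is done")

def aggregator(results_q, done_evt):
    total = 0
    for _ in range(NUM_WORKERS):  # blocking get(): only proceeds once every worker delivered
        total += results_q.get()
    print("aggregate sum:", total)
    done_evt.set()                # release the waiting workers

if __name__ == "__main__":
    q, evt = Queue(), Event()
    procs = [Process(target=aggregator, args=(q, evt))]
    procs += [Process(target=worker, args=(i, q, evt)) for i in range(NUM_WORKERS)]
    for p in procs:
        p.start()
    for p in procs:
        p.join()
```

The difficulty is repeating this hand-off for every round of values that the workers send.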

The code below attempts this synchronization with a shared done_flag that each handleWorker busy-waits on, plus an ack_q acknowledgement queue, but after running for a few steps it still gets stuck:

```python
import socket
import pickle
import sys
from ctypes import c_char_p
from multiprocessing import Process, Queue, Value, Manager, Lock

# TCP_IP, MAX_NUMBER_WORKERS, n, safe_recv, add_local_gradients and
# average_gradients are defined elsewhere in my program.

def handleWorker(port, gradients_q, done_flag, global_avg, ack_q, n, lock):
    s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    print("Connecting to port : ", port)
    s.bind((TCP_IP, port))
    s.listen(5)
    conn, addr = s.accept()
    print('Connection address:', addr)
    k = 0
    while 1:
        # Receive one set of gradients from the connected worker.
        size = safe_recv(17, conn)
        size = pickle.loads(size)
        data = safe_recv(size, conn)
        local_worker_gradients = pickle.loads(data)
        gradients_q.put(local_worker_gradients)
        # Busy-wait until aggregateSum signals that the average is ready.
        while (done_flag.value == 0):
            pass
        # Send the averaged gradients back to the worker and acknowledge.
        size = len(global_avg.value)
        size = pickle.dumps(size, pickle.HIGHEST_PROTOCOL)
        conn.sendall(size)
        conn.sendall(global_avg.value)
        ack_q.put(1)
        k = k + 1
        if (k == (n + 1)):
            break
    conn.close()
    s.close()


def aggregateSum(gradients_q, done_flag, global_avg, ack_q, lock):
    while (1):
        # Collect one set of gradients from every handleWorker process.
        global_sum = []
        for i in range(MAX_NUMBER_WORKERS):
            local_worker_gradients = gradients_q.get()
            if (i == 0):
                global_sum = local_worker_gradients
            else:
                add_local_gradients(global_sum, local_worker_gradients)
        avg = average_gradients(global_sum)
        global_avg.value = pickle.dumps(avg, pickle.HIGHEST_PROTOCOL)
        # Signal the handleWorker processes, wait for all acknowledgements,
        # then reset the flag for the next round.
        done_flag.value = 1
        for i in range(MAX_NUMBER_WORKERS):
            val = ack_q.get()
        done_flag.value = 0


def main(argv=None):
    lock = Lock()
    manager = Manager()
    global_avg = manager.Value(c_char_p, "")
    done_flag = manager.Value('i', 0)
    gradients_q = Queue()
    ack_q = Queue()

    master_process = Process(target=aggregateSum,
                             args=(gradients_q, done_flag, global_avg, ack_q, lock))
    master_process.start()

    port = int(sys.argv[1])
    process_list = []
    for i in range(MAX_NUMBER_WORKERS):
        process_port = port + i + 1
        p = Process(target=handleWorker,
                    args=(process_port, gradients_q, done_flag, global_avg, ack_q, n, lock))
        p.start()
        process_list.append(p)

    for p in process_list:
        p.join()
```
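I wonder whether reusing done_flag across rounds is what causes the hang, and whether a multiprocessing.Barrier shared by the handler processes and the aggregator would be a cleaner way to delimit each round. A simplified sketch of what I have in mind (again with placeholder names, no sockets or gradients):

```python
from multiprocessing import Process, Queue, Barrier

NUM_WORKERS = 2  # placeholder for MAX_NUMBER_WORKERS
NUM_ROUNDS = 3   # placeholder for n

def worker(worker_id, results_q, round_barrier):
    for step in range(NUM_ROUNDS):
        results_q.put((worker_id, step))  # send this round's value
        round_barrier.wait()              # wait until the aggregator finished this round
        # ...read the published average here before starting the next round...

def aggregator(results_q, round_barrier):
    for step in range(NUM_ROUNDS):
        values = [results_q.get() for _ in range(NUM_WORKERS)]  # blocks for all workers
        print("round", step, "collected", values)
        round_barrier.wait()              # release every worker for the next round

if __name__ == "__main__":
    q = Queue()
    barrier = Barrier(NUM_WORKERS + 1)    # all workers plus the aggregator
    procs = [Process(target=aggregator, args=(q, barrier))]
    procs += [Process(target=worker, args=(i, q, barrier)) for i in range(NUM_WORKERS)]
    for p in procs:
        p.start()
    for p in procs:
        p.join()
```

Is this the right direction, or is there a more standard way to make aggregateSum wait for both handleWorker processes on every round?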

0 Answers

No answers yet.