Memory usage of Python multiprocessing

Date: 2012-10-16 23:05:34

Tags: python multiprocessing

Is the memory consumed by a process spawned with multiprocessing.Process released once that process has been joined?

The scenario I have in mind is roughly the following:

from multiprocessing import Process
from multiprocessing import Queue
from Queue import Empty   # Python 2; on Python 3 this is "from queue import Empty"
import time
import os

def main():
    tasks = Queue()
    for task in [1, 18, 1, 2, 5, 2]:
        tasks.put(task)

    num_proc = 3           # this many workers at any point in time
    procs = []
    for j in range(num_proc):
        p = Process(target=run_q, args=(tasks,))
        procs.append(p)
        p.start()

    # join each worker once it's done
    while procs:
        for p in procs[:]:      # iterate over a copy so removal is safe
            if not p.is_alive():
                p.join()        # what happens to the memory allocated by run()?
                procs.remove(p)
                print p, len(procs)
        time.sleep(1)

def run_q(task_q):
    while True:                 # while there's stuff to do, keep working
        try:
            task = task_q.get_nowait()
        except Empty:           # queue drained; this worker is done
            break
        run(task)

def run(x):       # do real work, allocates memory
    print x, os.getpid()
    time.sleep(3*x)


if __name__ == "__main__":
    main()

In the real code, tasks is much longer than the number of CPU cores, each task object in the queue is lightweight, and different tasks take vastly different amounts of CPU time (from a few minutes to several days) and vastly different amounts of memory (from peanuts to a few GB). All of that memory is local to run and there is no need to share it, so the question is whether it is released once run returns, and/or once the process has been joined.
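One way to check this empirically (a rough sketch, assuming a Unix-like platform where the standard resource module is available) is to have each worker report its own peak resident set size at the end of run; since the counter is per-process, it also makes it obvious that whatever run allocated belonged only to that worker:

import os
import resource

def run(x):
    # ... the real work that allocates memory would go here ...
    # ru_maxrss is the peak RSS of *this* worker process only
    # (reported in kilobytes on Linux, in bytes on macOS)
    peak = resource.getrusage(resource.RUSAGE_SELF).ru_maxrss
    print("task %s in pid %d, peak RSS %d" % (x, os.getpid(), peak))

Comparing this with the parent's own resource.getrusage(resource.RUSAGE_SELF) before and after joining the workers shows that the children's allocations never count against the parent at all.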

1 Answer:

Answer 0 (score: 3)

The memory occupied by a process is released when the process terminates. In your example that happens when run_q() returns, because the worker process exits at that point.
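While a worker is still alive, CPython and the C allocator underneath it may keep memory freed by run cached rather than returning it to the OS, so process exit is the only hard guarantee. If you want every heavy task's memory back as soon as that task finishes, one option (a sketch using the standard multiprocessing.Pool API with maxtasksperchild, available since Python 2.7, and not something stated in the answer above) is to recycle the worker processes, so each worker exits, and the OS reclaims everything it allocated, after a single task:

from multiprocessing import Pool
import os
import time

def run(x):
    # do real work, allocates memory
    print("task %s in pid %d" % (x, os.getpid()))
    time.sleep(3 * x)

if __name__ == "__main__":
    tasks = [1, 18, 1, 2, 5, 2]
    # maxtasksperchild=1: each worker exits after completing one task,
    # so all memory that task allocated goes back to the OS right away
    pool = Pool(processes=3, maxtasksperchild=1)
    pool.map(run, tasks, chunksize=1)
    pool.close()
    pool.join()

Restarting a worker per task adds some startup overhead, which is negligible for tasks that run for minutes to days.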