Declaring shared memory after a process has started

Asked: 2017-08-28 17:37:47

Tags: python multiprocessing

I want to be able to create new multiprocessing.Value or multiprocessing.Array objects after the processes have already been started, as in this example:

# coding: utf-8
import multiprocessing

shared = {
    'foo': multiprocessing.Value('i', 42),
}


def job(pipe):
    while True:
        shared_key = pipe.recv()
        print(shared[shared_key].value)

process_read_pipe, process_write_pipe = multiprocessing.Pipe(duplex=False)

process = multiprocessing.Process(
    target=job,
    args=(process_read_pipe, )
)
process.start()

process_write_pipe.send('foo')

shared['bar'] = multiprocessing.Value('i', 24)
process_write_pipe.send('bar')

Output:

42
Process Process-1:
Traceback (most recent call last):
  File "/usr/lib/python3.5/multiprocessing/process.py", line 249, in _bootstrap
    self.run()
  File "/usr/lib/python3.5/multiprocessing/process.py", line 93, in run
    self._target(*self._args, **self._kwargs)
  File "/home/bux/Projets/synergine2/p.py", line 12, in job
    print(shared[shared_key].value)
KeyError: 'bar'

Process finished with exit code 0

The problem is this: the shared dict is copied into the child process when it starts. So if I add a key to the shared dict afterwards, the process cannot see it. How can the started process learn about the existence of the new multiprocessing.Value('i', 24)?

The Value cannot simply be sent through the pipe, because:

    Synchronized objects should only be shared between processes through inheritance

Any ideas?

1 answer:

Answer 0: (score: 0)

It looks like you are assuming that both processes can access the shared variable itself. In fact they only share the shared['foo'] value that existed before the fork. What you need is to share the dictionary itself.

Here is an example: Python multiprocessing: How do I share a dict among multiple processes?
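As a sketch of that approach (an assumed adaptation of the question's code, not something from the original answer), a multiprocessing.Manager dict is a proxy object, so keys added by the parent after the child has started are visible to the child:

```python
import multiprocessing


def job(shared, pipe):
    # Receive two keys and look each one up in the shared proxy dict.
    for _ in range(2):
        key = pipe.recv()
        print(shared[key])


if __name__ == '__main__':
    manager = multiprocessing.Manager()
    shared = manager.dict()   # proxy dict: lookups go through the manager process
    shared['foo'] = 42

    read_end, write_end = multiprocessing.Pipe(duplex=False)
    process = multiprocessing.Process(target=job, args=(shared, read_end))
    process.start()

    write_end.send('foo')

    shared['bar'] = 24        # added after start; the child still sees it
    write_end.send('bar')

    process.join()            # expected to print 42 then 24
```

Note the trade-off: plain ints are stored here instead of multiprocessing.Value objects, because a Manager dict holds ordinary (picklable) values; if you need the Value semantics, the linked answer discusses alternatives.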
