Passing user-defined objects between subprocesses

Asked: 2019-03-03 13:16:35

Tags: python multiprocessing python-multiprocessing ray

I have the following three files:

""" main.py """
import time
import ray

from learner import Learner
from worker import Worker

ray.init()

learner = Learner.remote()
worker = Worker.remote(learner)

worker.sample.remote()

time.sleep(10)
""" worker.py """
import ray


class LocalBuffer(dict):
    def __call__(self):
        return self

@ray.remote
class Worker():
    def __init__(self, learner):
        self.local = LocalBuffer()
        self.learner = learner

    def sample(self):
        for i in range(10):
            self.local.update({
                'state': [1, 2, 3]
            })
            print(self.local)
            self.learner.update_buffer.remote(self.local)
""" learner.py """
import ray

@ray.remote
class Learner():
    def __init__(self):
        self.buffer = {}

    def update_buffer(self, local_buffer):
        print(local_buffer)
        self.buffer['state'] = local_buffer['state']

If I remove all of the Ray-related code, the code above works fine. With Ray, however, it fails: the error message says that, inside update_buffer, local_buffer has no 'state' key. I know the error is caused by the LocalBuffer class defined in worker.py; if I define Worker.local as a built-in dict instead, everything works. But why can't I use LocalBuffer? I really need it here, and I don't know how to make it work.

Update

I now know where the problem is. The worker and the learner run in different processes, and a user-defined object such as self.local cannot simply be passed between processes. For this specific case I can work around it by casting self.local to a plain dict before passing it to self.learner.update_buffer, as sketched below. I also tried importing LocalBuffer in learner.py, but that did not work. Maybe I need to learn more about multiprocessing; I would appreciate any pointers.
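
For reference, here is a minimal sketch of that cast-to-dict workaround. Only the last line of sample changes; main.py and learner.py stay exactly as above.

""" worker.py (cast-to-dict workaround) """
import ray


class LocalBuffer(dict):
    def __call__(self):
        return self

@ray.remote
class Worker():
    def __init__(self, learner):
        self.local = LocalBuffer()
        self.learner = learner

    def sample(self):
        for i in range(10):
            self.local.update({
                'state': [1, 2, 3]
            })
            print(self.local)
            # cast the custom LocalBuffer to a plain built-in dict before
            # handing it to the learner, so only a standard type crosses
            # the process boundary
            self.learner.update_buffer.remote(dict(self.local))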

1 answer:

Answer 0 (score: 0)

We have to make LocalBuffer a Ray actor for this to work. The following code works as intended.

import ray

@ray.remote
class LocalBuffer(dict):
    # have to redefine these functions in order to make it work with ray
    def __getitem__(self, k):
        return super().__getitem__(k)

    def update(self, d):
        super().update(d)

    def __call__(self):
        # cannot return self since self is a ray actor
        return dict(super().items())


@ray.remote
class Worker():
    def __init__(self, learner):
        self.local = LocalBuffer.remote()
        self.learner = learner

    def sample(self):
        for i in range(10):
            self.local.update.remote({
                'state': [1, 2, 3]
            })
            print(ray.get(self.local.__call__.remote()))
            self.learner.update_buffer.remote(self.local)


@ray.remote
class Learner():
    def __init__(self):
        self.buffer = {}

    def update_buffer(self, local_buffer):
        print(ray.get(local_buffer.__call__.remote()))
        self.buffer['state'] = ray.get(local_buffer.__getitem__.remote('state'))
        print('learner buffer', self.buffer)

ray.init()

learner = Learner.remote()
worker = Worker.remote(learner)

ray.get(worker.sample.remote())
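
With this version the two actors no longer exchange a copy of the dictionary at all: Worker passes the LocalBuffer actor handle, and Learner reads from it through remote calls (__call__.remote, __getitem__.remote) wrapped in ray.get. The simpler alternative from the update above, passing dict(self.local), avoids the extra actor at the cost of copying the buffer on every call.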