Why does BasicEngine.RunInference hang in a child process?

Date: 2019-07-01 16:56:15

Tags: python tensorflow multiprocessing tensorflow-lite google-coral

I am trying to run the RunInference method of a BasicEngine object on a separate process using the multiprocessing module. It appears to deadlock, but I don't know why. My end goal is to collect a CPU usage profile of the RunInference method.

I am running this on a Raspberry Pi 3B+ with a Coral Accelerator and Python 3.5. When the BasicEngine is instantiated in the same process, the RunInference method executes correctly.

from edgetpu.classification.engine import BasicEngine
import random
import numpy as np
import psutil
import time
import multiprocessing as mp

#Code inspired from https://stackoverflow.com/questions/49197916/how-to-profile-cpu-usage-of-a-python-script
def run_on_another_process(target, args=(), kwargs=None):
    worker_process = mp.Process(target=target, args=args, kwargs=kwargs or {})
    worker_process.start()
    p = psutil.Process(worker_process.pid)

    while worker_process.is_alive():
        #Do something in the parent, e.g., measure the CPU percentage
        print("[Parent] CPU PERCENT: " + str(p.cpu_percent()))
        time.sleep(0.1)

    worker_process.join()

#Loading model
model_path = "<path_to_model>/inception_v1_224_quant_edgetpu.tflite"
engine = BasicEngine(model_path)
print("Engine and model ready!")

#Prepare data
input_size = engine.required_input_array_size()
data = np.array([random.randint(0, 255) for _ in range(input_size)], dtype=np.uint8)

#Inference on another process
run_on_another_process(engine.RunInference, args=(data,))
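For what it's worth, the same monitoring loop completes normally when the child runs a plain CPU-bound function with no Edge TPU dependency, which suggests the hang is specific to RunInference after a fork rather than to the multiprocessing pattern itself. A stdlib-only sketch (busy_work is a made-up stand-in for the inference call, and resource.getrusage is a Unix-only alternative to psutil):

```python
import multiprocessing as mp
import resource
import time


def busy_work(n):
    """Stand-in for RunInference: a pure CPU-bound loop (not part of edgetpu)."""
    total = 0
    for i in range(n):
        total += i * i
    return total


def run_on_another_process(target, args=()):
    worker = mp.Process(target=target, args=args)
    worker.start()

    samples = 0
    while worker.is_alive():
        # The parent could sample psutil.Process(worker.pid).cpu_percent() here;
        # this sketch just counts monitoring iterations.
        samples += 1
        time.sleep(0.05)

    worker.join()
    # Unix-only stdlib alternative to psutil: total user CPU time of reaped children.
    child_cpu = resource.getrusage(resource.RUSAGE_CHILDREN).ru_utime
    return samples, child_cpu


if __name__ == "__main__":
    samples, child_cpu = run_on_another_process(busy_work, args=(5000000,))
    print("samples:", samples, "child user CPU time (s):", round(child_cpu, 2))
```

With a target like this the parent's while loop exits and join() returns, so any deadlock observed with RunInference is not caused by the monitoring loop itself.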

The method appears to be deadlocked. If I trigger a KeyboardInterrupt, I get the following error:

Traceback (most recent call last):
  File "/<path_to_script>/script.py", line 31, in <module>
    run_on_another_process(lambda data: engine.RunInference(data), args=(data,))
  File "/<path_to_script>/script.py", line 17, in run_on_another_process
    time.sleep(0.1)
KeyboardInterrupt

Then I get:

Error in atexit._run_exitfuncs:
Traceback (most recent call last):
  File "/usr/lib/python3.5/multiprocessing/popen_fork.py", line 29, in poll
    pid, sts = os.waitpid(self.pid, flag)
KeyboardInterrupt

Any help would be greatly appreciated!

Edit #1:

In the working example below, both the BasicEngine constructor and the RunInference method are called in the same (child) process. The parent process prints a finite number of CPU-usage messages and then exits.

from edgetpu.classification.engine import BasicEngine
import random
import numpy as np
import psutil
import time
import multiprocessing as mp

def child_func():
    #Loading model
    model_path = "<path_to_model>/inception_v1_224_quant_edgetpu.tflite"
    engine = BasicEngine(model_path)
    print("[Child] Engine and model ready!")
    #Prepare data
    input_size = engine.required_input_array_size()
    data = np.array([random.randint(0, 255) for _ in range(input_size)], dtype=np.uint8)
    engine.RunInference(data)

def run_on_another_process(target, args=(), kwargs=None):
    worker_process = mp.Process(target=target, args=args, kwargs=kwargs or {})
    worker_process.start()
    p = psutil.Process(worker_process.pid)


    while worker_process.is_alive():
        #Do something in the parent, e.g., measure the CPU percentage
        print("[Parent] CPU PERCENT: " + str(p.cpu_percent()))
        time.sleep(0.1)

    worker_process.join()

#Run on another process
run_on_another_process(child_func)
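As an aside, child_func above discards the inference result. If the parent needs the result back, one option is to wrap the target and send its return value through a multiprocessing.Pipe; a minimal sketch, where square is a made-up stand-in for the real inference call:

```python
import multiprocessing as mp


def child_with_result(conn, target, args):
    # Run the target in the child and ship its return value to the parent.
    conn.send(target(*args))
    conn.close()


def square(x):
    """Stand-in for the real inference call."""
    return x * x


if __name__ == "__main__":
    parent_conn, child_conn = mp.Pipe()
    worker = mp.Process(target=child_with_result, args=(child_conn, square, (7,)))
    worker.start()
    result = parent_conn.recv()  # blocks until the child sends its result
    worker.join()
    print(result)  # → 49
```

In the real setup, the wrapped target would build the BasicEngine and call RunInference inside the child (as child_func does) and send back whatever the parent needs.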

0 Answers:

No answers yet.