subprocess.check_output without high memory usage

Date: 2015-06-15 14:16:38

Tags: python linux memory subprocess

In my current project I have a webserver that calls Linux commands to get information that is then displayed on the website. The problem is that the webserver runs on a tiny embedded device (it is basically a configuration tool for that device) with only 256 MB of RAM. The webserver itself already takes more than half of the free RAM on the device.

Now when I try to use subprocess.check_output() to call a command, the fork briefly doubles the RAM usage (because, as far as I understand, it clones the parent process), which crashes the whole thing with an out-of-memory error, even though the called process itself is quite tiny.
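
For illustration, the failing pattern is roughly the following (the uptime command is just a placeholder for whatever is being called):

from subprocess import check_output

# Called directly from the large webserver process: the underlying fork()
# briefly accounts for as much memory as the parent, so on a 256 MB device
# it can fail with something like OSError: [Errno 12] Cannot allocate memory,
# even though the child command itself is tiny.
uptime = check_output(['uptime'])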

Since the device uses pretty cheap flash chips that have proven to fail when overused, I don't want to use swap or any other solution based on increasing virtual memory.

What I have tried so far is to Popen an sh session at the start of the program, while memory usage is still low, and then write commands to that sh session and read the output. This kind of works, but it is quite unstable, since a stray "exit" or something similar can bring the whole thing down.
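
Roughly, that workaround looks like the following sketch (the run_in_shell helper and the sentinel string are made up for illustration, and it assumes Python 2 byte strings):

from subprocess import Popen, PIPE

# One long-lived shell, started while memory usage is still low.
shell = Popen(['sh'], stdin=PIPE, stdout=PIPE)

def run_in_shell(command):
    # Write the command, then echo a sentinel so we know where its output ends.
    shell.stdin.write(command + '; echo __CMD_DONE__\n')
    shell.stdin.flush()
    lines = []
    while True:
        line = shell.stdout.readline()
        if line.strip() == '__CMD_DONE__':
            break
        lines.append(line)
    return ''.join(lines)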

Is there any solution similar to subprocess.check_output() that doesn't double my memory usage?

1 Answer:

Answer 0 (score: 7)

So with the help of J.F. Sebastian I figured it out.

Here is the code I ended up using:

from multiprocessing import Process, Queue
from subprocess import check_output, CalledProcessError

def cmdloop(inQueue, outQueue):
    # Worker loop running in a small helper process; its own forks are cheap
    # because it never grows to the size of the webserver.
    while True:
        command = inQueue.get()
        try:
            result = check_output(command, shell=True)
        except CalledProcessError as e:
            result = e

        outQueue.put(result)

# Fork the helper process early, while the parent's memory usage is still low.
inQueue = Queue()
outQueue = Queue()
cmdHostProcess = Process(target=cmdloop, args=(inQueue, outQueue))
cmdHostProcess.start()

def callCommand(command):
    # Hand the command to the helper and block until its output
    # (or the CalledProcessError) comes back.
    inQueue.put(command)
    return outQueue.get()

def killCmdHostProcess():
    cmdHostProcess.terminate()
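
Usage from the webserver then looks like this (the command shown is only an example):

output = callCommand("free -m")  # returns the command's stdout, or the CalledProcessError
killCmdHostProcess()             # shut the helper process down when the webserver exits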

On Python 3.4+ I could have used multiprocessing.set_start_method('forkserver'), but since this has to run on Python 2.7, that unfortunately was not an option.
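
For reference, on Python 3.4+ the same setup would look roughly like this (a sketch reusing cmdloop from above; not part of the original answer):

import multiprocessing

if __name__ == '__main__':
    # forkserver starts one small server process early on; later Process objects
    # are forked from that small server rather than from the large webserver.
    multiprocessing.set_start_method('forkserver')
    inQueue = multiprocessing.Queue()
    outQueue = multiprocessing.Queue()
    cmdHostProcess = multiprocessing.Process(target=cmdloop, args=(inQueue, outQueue))
    cmdHostProcess.start()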

This still reduces my memory usage and gets rid of the problem in a clean way. Thanks a lot for your help!