<p>You don't need <code>multiprocessing</code> or <code>threading</code> to run subprocesses in parallel, e.g.:</p>
<pre><code>#!/usr/bin/env python
from subprocess import Popen
# run commands in parallel
processes = [Popen("echo {i:d}; sleep 2; echo {i:d}".format(i=i), shell=True)
             for i in range(5)]
# collect statuses
exitcodes = [p.wait() for p in processes]
</code></pre>
<p>It runs 5 shell commands concurrently. Note: neither threads nor the <code>multiprocessing</code> module is used here. There is also no point appending an ampersand <code>&amp;</code> to the shell commands: <code>Popen</code> doesn't wait for a command to complete. You have to call <code>.wait()</code> explicitly.</p>
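<p>A minimal sketch of that behavior: right after <code>Popen</code> returns, <code>poll()</code> reports <code>None</code> because the child is still running, and <code>wait()</code> blocks until it exits and returns the exit code:</p>
<pre><code>#!/usr/bin/env python
from subprocess import Popen

p = Popen("sleep 1", shell=True)  # Popen returns immediately
print(p.poll())   # None: the child is still running
print(p.wait())   # blocks until the child exits, then prints 0
</code></pre>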
<p>It's convenient but not necessary to use threads to collect output from the subprocesses:</p>
<pre><code>#!/usr/bin/env python
from multiprocessing.dummy import Pool # thread pool
from subprocess import Popen, PIPE, STDOUT
# run commands in parallel
processes = [Popen("echo {i:d}; sleep 2; echo {i:d}".format(i=i), shell=True,
                   stdin=PIPE, stdout=PIPE, stderr=STDOUT, close_fds=True)
             for i in range(5)]
# collect output in parallel
def get_lines(process):
    return process.communicate()[0].splitlines()
outputs = Pool(len(processes)).map(get_lines, processes)
</code></pre>
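<p>The stdlib <code>concurrent.futures</code> thread pool works just as well as <code>multiprocessing.dummy</code> here; an equivalent sketch of the same idea:</p>
<pre><code>#!/usr/bin/env python
from concurrent.futures import ThreadPoolExecutor  # thread pool
from subprocess import Popen, PIPE, STDOUT

# run commands in parallel
processes = [Popen("echo {i:d}".format(i=i), shell=True,
                   stdin=PIPE, stdout=PIPE, stderr=STDOUT, close_fds=True)
             for i in range(5)]

def get_lines(process):
    return process.communicate()[0].splitlines()

# collect output in parallel
with ThreadPoolExecutor(max_workers=len(processes)) as pool:
    outputs = list(pool.map(get_lines, processes))
print(outputs)  # [[b'0'], [b'1'], [b'2'], [b'3'], [b'4']]
</code></pre>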
<p>Related: <a href="https://stackoverflow.com/a/14533902/4279">Python threading multiple bash subprocesses?</a></p>
<p>Here's a code example that gets output from several subprocesses concurrently in the same thread:</p>
<pre><code>#!/usr/bin/env python3
import asyncio
import sys
from asyncio.subprocess import PIPE, STDOUT
async def get_lines(shell_command):
    p = await asyncio.create_subprocess_shell(
        shell_command, stdin=PIPE, stdout=PIPE, stderr=STDOUT)
    return (await p.communicate())[0].splitlines()

async def main():
    # get commands' output in parallel
    coros = [get_lines('"{e}" -c "print({i:d}); import time; time.sleep({i:d})"'
                       .format(i=i, e=sys.executable)) for i in range(5)]
    print(await asyncio.gather(*coros))

# asyncio.run() picks an event loop that supports subprocess
# pipes on Windows by default (Python 3.8+)
asyncio.run(main())
</code></pre>
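<p>If you need to cap how many subprocesses run at once, the same asyncio approach extends naturally with <code>asyncio.Semaphore</code>; a sketch (the limit of 2 is arbitrary):</p>
<pre><code>#!/usr/bin/env python3
import asyncio
import sys
from asyncio.subprocess import PIPE, STDOUT

async def get_lines(sem, shell_command):
    async with sem:  # at most `limit` children run concurrently
        p = await asyncio.create_subprocess_shell(
            shell_command, stdin=PIPE, stdout=PIPE, stderr=STDOUT)
        return (await p.communicate())[0].splitlines()

async def main(limit=2):
    sem = asyncio.Semaphore(limit)
    coros = [get_lines(sem, '"{e}" -c "print({i:d})"'.format(i=i, e=sys.executable))
             for i in range(5)]
    outputs = await asyncio.gather(*coros)  # results keep input order
    print(outputs)
    return outputs

asyncio.run(main())  # [[b'0'], [b'1'], [b'2'], [b'3'], [b'4']]
</code></pre>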