Sharing read-only objects between processes



My software uses a model of "kept-alive" worker processes (they communicate with the main process through pipes), and I am trying to share read-only objects between them and the main process. An example to illustrate my problem:

from multiprocessing import Process, Pipe#, Manager
from multiprocessing.connection import wait
import os


def start_in_oneshot_processes(obj, nb_process):
    """
    Start nb_process processes to do the job. Then process finish the job they die.
    """
    processes = []
    for i in range(nb_process):
        #  Simple process style
        p = Process(target=oneshot_in_process, args=(obj,))
        p.start()
        processes.append(p)

    for process in processes:
        # Wait for all processes to finish
        process.join()

def oneshot_in_process(obj):
    """
    Main job (don't matter if in oneshot, keeped alive process. It have job to do)
    """
    print('p', obj, os.getpid())


def start_in_keepedalive_processes(obj, nb_process):
    """
    Start nb_process and keep them alive. Send job to them multiple times, then close thems.
    """
    processes = []
    readers_pipes = []
    writers_pipes = []
    for i in range(nb_process):
        # Start the process with two pipes for communication
        local_read_pipe, local_write_pipe = Pipe(duplex=False)
        process_read_pipe, process_write_pipe = Pipe(duplex=False)
        readers_pipes.append(local_read_pipe)
        writers_pipes.append(process_write_pipe)
        p = Process(target=run_keepedalive_process, args=(local_write_pipe, process_read_pipe, obj))
        p.start()
        processes.append(p)
    # Send the processes some jobs to do
    for job in range(3):
        print('send new job to processes:')
        reader_useds = []
        for process_number in range(nb_process):
            # Send data to the process
            writers_pipes[process_number].send(obj)
        # Wait for a response from every process
        while readers_pipes:
            for r in wait(readers_pipes):
                try:
                    r.recv()
                except EOFError:
                    pass
                finally:
                    reader_useds.append(r)
                    readers_pipes.remove(r)
        readers_pipes = reader_useds

    # Ask the processes to stop
    for writer_pipe in writers_pipes:
        writer_pipe.send('stop')

def run_keepedalive_process(main_write_pipe, process_read_pipe, obj):
    """
    Procees who don't finish while job to do
    """
    while obj != 'stop':
        oneshot_in_process(obj)
        # Tell the main process "I've done my job"
        main_write_pipe.send('job is done')
        # Wait for a new job to do (can this part be simplified?)
        readers = [process_read_pipe]
        while readers:
            for r in wait(readers):
                try:
                    obj = r.recv()
                except EOFError:
                    pass
                finally:
                    readers.remove(r)


if __name__ == '__main__':  # guard needed when the 'spawn' start method is used (e.g. on Windows)
    obj = object()
    print('m', obj, os.getpid())

    print('One shot processes:')
    start_in_oneshot_processes(obj, 5)

    print('Keeped alive processes:')
    start_in_keepedalive_processes(obj, 5)

    print('f', obj, os.getpid())

The output (elided here) shows the following:

When I create simple processes (start_in_oneshot_processes), obj has the same memory address in the child processes as in the main process: 0xb7266dc8.

But when my processes receive the object through a pipe (start_in_keepedalive_processes), the object's memory address differs from the one in the main process: for example, 0xb7266488 instead of 0xb7266dc8. These objects are read-only for the child processes. How can I share them between the main process and the children (and avoid the time spent copying memory)?


1 Answer

You cannot get what you want using Pipe. When data is sent through a pipe to another process, that process has to store a copy of the data in its own address space, which in Python means a new object. Most other methods of moving data between processes behave the same way.
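
To make the copy concrete, here is a minimal sketch (the payload and names are illustrative, not from the question) showing that the object coming out of a pipe is a brand-new object with its own id():

from multiprocessing import Process, Pipe
import os


def child(conn):
    received = conn.recv()  # unpickled here: a fresh object in this process
    print('child ', id(received), os.getpid())
    conn.close()


if __name__ == '__main__':
    parent_conn, child_conn = Pipe()
    payload = {'answer': 42}
    print('parent', id(payload), os.getpid())
    p = Process(target=child, args=(child_conn,))
    p.start()
    parent_conn.send(payload)  # pickled and written into the pipe
    p.join()

The two printed ids differ even on a fork-based platform, because recv() always rebuilds the object from the pickled bytes.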

The identical memory address you saw from start_in_oneshot_processes is probably due to your choice of operating system. In general, two processes do not share the same RAM at all. (See the multiprocessing documentation on start methods for the difference between Windows' spawn and Unix's fork.)
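
As a sketch of that difference (assuming a platform where fork is available; spawn is the default on Windows), the following runs the same child under both start methods:

import multiprocessing as mp
import os


def report(tag, obj):
    # Under fork the child inherits the parent's memory image, so id(obj)
    # matches the parent; under spawn the args are pickled, so it is a copy.
    print(tag, hex(id(obj)), os.getpid())


if __name__ == '__main__':
    obj = object()
    report('parent', obj)
    for method in ('fork', 'spawn'):
        try:
            ctx = mp.get_context(method)
        except ValueError:
            continue  # this start method is not available on this platform
        p = ctx.Process(target=report, args=(method, obj))
        p.start()
        p.join()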

If it is important that both processes can look at the same chunk of memory, you can try shared memory or an object manager process. Note that those same docs tell you that this is rarely a good idea.
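
For completeness, a minimal sketch of the shared-memory route (Python 3.8+; it shares raw bytes, not arbitrary Python objects, which is part of why it is rarely worth it):

from multiprocessing import Process
from multiprocessing import shared_memory


def reader(name, size):
    shm = shared_memory.SharedMemory(name=name)  # attach to the existing block
    print('child sees:', bytes(shm.buf[:size]))
    shm.close()


if __name__ == '__main__':
    data = b'read-only payload'
    shm = shared_memory.SharedMemory(create=True, size=len(data))
    shm.buf[:len(data)] = data  # both processes view the same physical memory
    p = Process(target=reader, args=(shm.name, len(data)))
    p.start()
    p.join()
    shm.close()
    shm.unlink()  # release the block once no one needs it anymore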
