I want to have a model with shared memory (the same model parameters, no training) across all processes, so it is necessary to use torch.multiprocessing.Process. In addition, I want a duplex communication channel between the master and each worker. Can I combine PyTorch's Process with Python's Pipe, i.e. pass the pipe connection ends as arguments to the PyTorch Process?
from torch.multiprocessing import Process
from multiprocessing import Pipe

def worker(worker_connection):
    # Use the worker's end of the duplex pipe to talk to the master.
    msg = worker_connection.recv()
    worker_connection.send(msg + " (reply from worker)")

def master():
    master_connection, worker_connection = Pipe()  # duplex by default
    process = Process(target=worker, args=(worker_connection,))
    process.start()
    master_connection.send("hello")
    print(master_connection.recv())  # read the worker's reply
    process.join()

if __name__ == "__main__":
    master()
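For the shared-model half of the question, here is a minimal sketch of the full pattern I have in mind, assuming that calling share_memory() on the module makes its parameters visible to the worker process and that tensors can be sent through the pipe (the Linear layer size and the input tensor are just illustrative):

import torch
import torch.nn as nn
from torch.multiprocessing import Process
from multiprocessing import Pipe

def worker(model, conn):
    # The model's parameters live in shared memory, so the worker sees
    # the master's weights without copying the module.
    x = conn.recv()            # receive an input tensor from the master
    with torch.no_grad():      # inference only, no training
        conn.send(model(x))    # send the output back over the same pipe

def master():
    model = nn.Linear(4, 2)    # illustrative model
    model.share_memory()       # move parameters into shared memory
    master_conn, worker_conn = Pipe()
    p = Process(target=worker, args=(model, worker_conn))
    p.start()
    master_conn.send(torch.randn(1, 4))
    print(master_conn.recv())
    p.join()

if __name__ == "__main__":
    master()

Since torch.multiprocessing is meant as a drop-in replacement for the standard multiprocessing module, my understanding is that Pipe could also be imported from torch.multiprocessing directly.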