Multiprocessing PyTorch Models

Hello,
I want to use multiprocessing to train several models simultaneously, where each process (CPU core) updates the parameters of its own model. Does PyTorch support such operations?
For example, first I create a list of models in which each element is a separate model:
net_list = [Net() for _ in range(num_threads)]
Then I use multiprocessing to start one process per model to update it. Will this work with PyTorch?
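For what it's worth, here is a minimal sketch of what I have in mind, loosely following the Hogwild-style pattern from the PyTorch docs. The names `Net`, `train`, and `num_processes` are just placeholders for illustration:

```python
import torch
import torch.multiprocessing as mp
import torch.nn as nn

class Net(nn.Module):
    # Toy model standing in for the real network.
    def __init__(self):
        super().__init__()
        self.fc = nn.Linear(10, 1)

    def forward(self, x):
        return self.fc(x)

def train(model):
    # Each worker process optimizes its own model on its own (dummy) data.
    opt = torch.optim.SGD(model.parameters(), lr=0.01)
    for _ in range(100):
        x = torch.randn(32, 10)
        y = torch.randn(32, 1)
        loss = nn.functional.mse_loss(model(x), y)
        opt.zero_grad()
        loss.backward()
        opt.step()

if __name__ == "__main__":
    num_processes = 4
    net_list = [Net() for _ in range(num_processes)]
    processes = []
    for net in net_list:
        # share_memory() puts the parameters in shared memory, so the
        # updates made in a child process are visible to the parent.
        net.share_memory()
        p = mp.Process(target=train, args=(net,))
        p.start()
        processes.append(p)
    for p in processes:
        p.join()
```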

Why do you need multiprocessing for that? You could easily do that with a bash script.


Because after optimizing these networks in separate processes, the weights need to be exchanged between them, which is convenient to implement in a main process (see the sketch below).
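As one concrete form of weight exchange, the main process could average the parameters across all models after the workers finish. This is just a sketch; it assumes the `net_list` from the snippet above, with identical architectures and floating-point parameters:

```python
import torch

def average_weights(net_list):
    # Average each parameter tensor across all models, then load the
    # averaged weights back into every model.
    avg_state = {
        key: torch.stack([net.state_dict()[key] for net in net_list]).mean(dim=0)
        for key in net_list[0].state_dict()
    }
    for net in net_list:
        net.load_state_dict(avg_state)
```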

I found the official doc about multiprocessing, but I also ran into a problem with it. Here is the problem; if you have any ideas, could you share them with me?
Thanks.