Cannot use a model inside a subprocess after using a model in the main process

Hi!
I am trying to run inference on two models in parallel: one in the main process and one in a subprocess. If I run the model only in the subprocess (or in several subprocesses), it works fine. Here is a simple example:

from torch.multiprocessing import Process
import torch
import torch.nn as nn

class Net(nn.Module):
    def __init__(self):
        super(Net, self).__init__()
        self.conv = nn.Conv2d(3, 1, 3)

    def forward(self, x):
        return self.conv(x)

def parallel_model(inp, model):
    # run inference inside the child process
    with torch.no_grad():
        output = model(inp)
    print(output.shape)

if __name__ == '__main__':
    
    model = Net()
    model.eval()

    inp = torch.rand(1,3,128,128)

    p = Process(target=parallel_model, args=(inp, model))
    p.start()
    p.join()
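
When run like this, everything works as expected: the subprocess prints torch.Size([1, 1, 126, 126]) and exits normally.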
    

However, if I first run inference inside the main process (even if I use a different model), the model inside the subprocess will not execute inference and just gets stuck on the inference line. Here is an example where the model in the subprocess doesn't execute:

from torch.multiprocessing import Process
import torch
import torch.nn as nn

class Net(nn.Module):
    def __init__(self):
        super(Net, self).__init__()
        self.conv = nn.Conv2d(3, 1, 3)

    def forward(self, x):
        return self.conv(x)

def parallel_model(inp, model):
    # run inference inside the child process
    with torch.no_grad():
        output = model(inp)
    print(output.shape)

if __name__ == '__main__':
    
    model = Net()
    model.eval()
    
    model_1 = Net()  # added: a second model
    model_1.eval()

    inp = torch.rand(1,3,128,128)

    with torch.no_grad():  # added: score with the new model in the main process
        print(model_1(inp))

    p = Process(target=parallel_model, args=(inp, model))
    p.start()
    p.join()
    
    

I have tried to debug this: inside the subprocess you can still successfully print both the model and the input, but as soon as execution enters the model's forward method, the process just hangs.
Can somebody please explain why it works this way? And can I somehow fix it?
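
My current (unconfirmed) guess is that the default fork start method on Linux is involved: the forked child would inherit the parent's already-initialized thread state (e.g. the OpenMP thread pool that gets set up during the first forward pass), which is not safe to use after fork(), so the child deadlocks on its first op. If that guess is right, switching to the spawn start method should avoid the hang. A minimal sketch of that change, assuming Net and parallel_model stay exactly as defined above:

import torch
import torch.multiprocessing as mp

if __name__ == '__main__':

    # 'spawn' starts the child in a fresh interpreter instead of fork(),
    # so it does not inherit the parent's thread state.
    # Must be set once, before the first Process is created.
    mp.set_start_method('spawn')

    model = Net()
    model.eval()

    inp = torch.rand(1, 3, 128, 128)

    with torch.no_grad():
        print(model(inp))

    # With spawn, the args are pickled and sent to the child, so Net and
    # parallel_model must be defined at module top level (as they are here).
    p = mp.Process(target=parallel_model, args=(inp, model))
    p.start()
    p.join()

I have not verified this everywhere (on Windows and macOS spawn is already the default start method), so I would still appreciate an explanation of what exactly fork breaks here.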