Does load_state_dict break share_memory()?

Hi,

I have the following code:

```python
model.share_memory()

send_to_other_process(model)  # send the model to another process

model.load_state_dict(torch.load('checkpoint.pth'))
```

Will the model that was passed to `send_to_other_process(model)` still see whatever changes I make to the model in this process after the load?

If not, how can I make `load_state_dict` not break the model's `share_memory()` property?
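For concreteness, here is a minimal sketch of what I mean, using a toy `nn.Linear` in place of my real model and a randomly generated state dict in place of the checkpoint. I'm assuming that comparing `data_ptr()` before and after the load is a valid way to check whether the shared parameter storages were replaced:

```python
import torch
import torch.nn as nn

# Toy model standing in for the real one
model = nn.Linear(4, 2)
model.share_memory()

# Remember where the shared parameter storages live
ptrs_before = [p.data_ptr() for p in model.parameters()]

# Simulate loading a checkpoint with matching shapes
state = {k: torch.randn_like(v) for k, v in model.state_dict().items()}
model.load_state_dict(state)

# If these still match, the load copied values in place and
# the storages seen by the other process were not replaced
ptrs_after = [p.data_ptr() for p in model.parameters()]
print("storages unchanged:", ptrs_before == ptrs_after)
print("still shared:", all(p.is_shared() for p in model.parameters()))
```

Is checking `is_shared()` and the data pointers like this the right way to reason about it, or does `load_state_dict` do something subtler under the hood?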

Thanks a lot in advance!