Unable to replace the weights of a network

Hello,

I have defined my model. In the forward pass of the neural network, I want to apply some changes to the outputs and the weights of the layers. I can apply my operation to the weights, but I can't replace the old weights with the new ones.
To get the network's weights, I am using .state_dict().

Thank you!

This might be helpful: https://discuss.pytorch.org/t/how-to-assign-a-tensor-with-another-tensor/110644
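
If I remember right, the gist of that thread is to copy the new values into the existing parameter in place instead of rebinding a name. Roughly like this (model and new_weight are placeholders for your own network and tensor):

    import torch

    # In-place copy under no_grad, so autograd doesn't record the assignment
    with torch.no_grad():
        model.linear2.weight.copy_(new_weight)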

Thanks for your reply, Jerly!
I did exactly the same as you, but I don't know what the problem is. I'm sharing part of my code here.

    # Normalizing the weight tensor
    SD = MyResnet2.state_dict()
    W2 = SD['linear2.weight'].clone()
    norm_W2 = W2.norm(p=2, dim=1, keepdim=True)
    normalized_W2 = W2.div(norm_W2.expand_as(W2))

    # Replacing the previous weights with the normalized ones
    MyResnet2.state_dict()['linear2.weight'] = torch.nn.Parameter(normalized_W2)
    SD = MyResnet2.state_dict()
    W3 = SD['linear2.weight']
    Residue = normalized_W2 - W3  # This should be a zero tensor
    print(Residue)

And Residue is non-zero!

state_dict() returns a fresh OrderedDict each time you call it, so assigning a new tensor to one of its keys only changes that dict; the module's parameters are untouched. You have to load the modified dict back into the model with load_state_dict():

    # Replacing the previous weights with the normalized ones
    SD = MyResnet2.state_dict()
    SD['linear2.weight'] = normalized_W2
    MyResnet2.load_state_dict(SD)

    # Re-read from the model to check that the update actually landed
    W3 = MyResnet2.state_dict()['linear2.weight']
    Residue = normalized_W2 - W3  # This should now be a zero tensor
    print(Residue)

Change yours to this one; it should work.
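
For anyone who wants to reproduce this end to end, here is a minimal self-contained sketch with a toy model (the layer name linear2 just mirrors the snippets above):

    import torch
    import torch.nn as nn

    # Toy model with a submodule named like the layer in the snippets above
    model = nn.Sequential()
    model.add_module('linear2', nn.Linear(4, 3))

    # Row-normalize the weight, as in the original post
    SD = model.state_dict()
    W2 = SD['linear2.weight'].clone()
    normalized_W2 = W2 / W2.norm(p=2, dim=1, keepdim=True)

    # Write the modified dict back into the model
    SD['linear2.weight'] = normalized_W2
    model.load_state_dict(SD)

    # Verify against the model's current weights
    W3 = model.state_dict()['linear2.weight']
    print((normalized_W2 - W3).abs().max())  # prints tensor(0.)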

Thanks for your reply!
I figured out what the problem was. Before, I was applying my manipulation inside the neural network class; now I do it in the main loop of the network training, and it works.
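
For completeness, doing the same manipulation in the training loop might look roughly like this (the MyResnet2.linear2 attribute and the optimizer step are assumptions based on the snippets above):

    # After optimizer.step(), inside the training loop
    with torch.no_grad():
        W = MyResnet2.linear2.weight
        W.copy_(W / W.norm(p=2, dim=1, keepdim=True))  # row-normalize in place

Doing it here, outside forward(), also avoids mutating parameters while autograd is tracing the graph, which can cause in-place modification errors on backward().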