How to share network weights

I constructed a network and run two forward passes through it. The second forward pass uses
x = nn.functional.conv2d(x, self.conv1.weight[0:1, :, :, :], padding=1)
So when loss2 is backpropagated, does conv1.weight get updated? In practice, while training the model, loss2 always stays around 2.3, but loss1 keeps declining.
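For reference, here is a minimal sketch of what such a setup might look like (the layer shapes and the SharedConvNet name are hypothetical, not from my actual model). Slicing self.conv1.weight keeps the functional pass connected to the layer's parameters in the autograd graph, so loss2.backward() should produce gradients in conv1.weight.grad:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SharedConvNet(nn.Module):
    def __init__(self):
        super().__init__()
        # Hypothetical shapes: 3 input channels, 16 output filters.
        self.conv1 = nn.Conv2d(3, 16, kernel_size=3, padding=1)

    def forward(self, x):
        # First pass: the ordinary layer call.
        out1 = self.conv1(x)
        # Second pass: reuse a slice of the same weight tensor.
        # The slice stays connected to conv1.weight in the autograd
        # graph, so gradients from this pass also reach conv1.
        out2 = F.conv2d(x, self.conv1.weight[0:1, :, :, :], padding=1)
        return out1, out2

model = SharedConvNet()
x = torch.randn(2, 3, 8, 8)
out1, out2 = model(x)

# Backpropagating a loss on the second output populates conv1.weight.grad,
# confirming that the shared (sliced) weight does receive gradients.
loss2 = out2.mean()
loss2.backward()
print(model.conv1.weight.grad[0:1].abs().sum())  # non-zero: the sliced filter was used
print(model.conv1.weight.grad[1:].abs().sum())   # zero: the other filters were not used here
```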

Please make your question clear and provide some code if you have it.
For example, I am not sure what loss1 and loss2 are. Unless the question is clear, the chances are low that you will get an answer.