How to change/initialize values without "leaf variable has been moved into the graph interior"

I am building an unusual network whose parameters need to be manually set to specific values. I am currently porting it to the newest PyTorch; everything worked as expected in an older version. Now, when I do (nn.Conv2d).weight = (other nn.Conv2d).weight, I get “leaf variable has been moved into the graph interior.” How can I make this assignment? If it helps, I only need to do this at the point of creation, but I need to do it for a Conv2d, a BatchNorm2d, and a custom nn.Parameter.

Hi,

Since I guess you don’t want gradients to flow back through this op, you should do:

with torch.no_grad():
    conv.weight.copy_(other_conv.weight)  # in-place copy; conv and other_conv are your two nn.Conv2d modules
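The same in-place copy_ pattern covers the other cases you mention. Here is a minimal sketch, assuming you do this once at creation time; all module and variable names (src_conv, dst_bn, my_param, and the layer shapes) are placeholders, not anything specific to your network:

import torch
import torch.nn as nn

dst_conv, src_conv = nn.Conv2d(3, 16, 3), nn.Conv2d(3, 16, 3)
dst_bn, src_bn = nn.BatchNorm2d(16), nn.BatchNorm2d(16)
my_param = nn.Parameter(torch.zeros(16))

with torch.no_grad():
    # Conv2d: copy weight and bias in place
    dst_conv.weight.copy_(src_conv.weight)
    dst_conv.bias.copy_(src_conv.bias)
    # BatchNorm2d: affine parameters, plus the running-stat buffers if you need them
    dst_bn.weight.copy_(src_bn.weight)
    dst_bn.bias.copy_(src_bn.bias)
    dst_bn.running_mean.copy_(src_bn.running_mean)
    dst_bn.running_var.copy_(src_bn.running_var)
    # Custom nn.Parameter: same in-place copy
    my_param.copy_(torch.ones_like(my_param))

Because everything happens under torch.no_grad(), the copies are not recorded in the autograd graph, so the parameters stay leaf variables and training afterwards works as usual.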

That did it, thanks!