[solved] Copy gradient values

I have an instance of a neural network called myNet (it inherits from nn.Module). I would like to perform a deep copy of this network, including the gradient values of all of its parameters.

To make an initial copy of the parameter values, I used a deep copy as follows:

myCopy = copy.deepcopy(myNet)
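
For example, with a small toy network standing in for myNet (illustrative setup only):

import copy
import torch
import torch.nn as nn

myNet = nn.Linear(2, 1)                    # toy stand-in for my actual network
myNet(torch.randn(4, 2)).sum().backward()  # populate the gradients

myCopy = copy.deepcopy(myNet)
print(myNet.weight.grad)   # a populated gradient tensor
print(myCopy.weight.grad)  # None: the gradient was not copied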

As the output above shows, this does not copy the gradient values. Thus, I performed the following:

import torch.optim as optim

optimizerCopy = optim.SGD(myCopy.parameters(), lr=0.1, momentum=0.0)
optimizerCopy.zero_grad()
for paramName, paramValue in myNet.named_parameters():
    for copyName, copyValue in myCopy.named_parameters():
        if paramName == copyName:
            copyValue.grad = paramValue.grad

However, this only makes a shallow copy of the gradient values: both networks end up pointing at the very same gradient tensors.
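
A quick way to see the problem, continuing the toy setup above:

print(myCopy.weight.grad is myNet.weight.grad)  # True: same tensor object
myNet.weight.grad.zero_()                       # zeroing one network's gradient...
print(myCopy.weight.grad)                       # ...zeroes the "copy" as well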

How can I perform a deep copy of gradient values from myNet to myCopy?

I have found the answer to my own question (in case others would like to know). The code can be altered as follows to get the desired behavior:

optimizerCopy = optim.SGD(myCopy.parameters(), lr=0.1, momentum=0.0)
optimizerCopy.zero_grad()
for paramName, paramValue in myNet.named_parameters():
    for copyName, copyValue in myCopy.named_parameters():
        if paramName == copyName:
            # clone() allocates a new, independent tensor
            copyValue.grad = paramValue.grad.clone()
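
As an aside, the nested loop can be avoided by looking parameters up by name in a dict. A minimal sketch of the same idea (it assumes matching parameter names, which deepcopy guarantees):

sourceParams = dict(myNet.named_parameters())
for name, param in myCopy.named_parameters():
    grad = sourceParams[name].grad
    param.grad = None if grad is None else grad.clone()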