Difference between ._grad and .grad

Hello,

I’m currently implementing A3C from scratch, and I have some questions about how the gradients are copied from the local model into the shared one. In particular, I’ve noticed that after computing the loss and calling .backward(), most of the implementations available on GitHub use the following snippet of code:

for lp, sp in zip(local_net.parameters(), shared_net.parameters()):
    sp._grad = lp.grad

I mainly have two questions:

  1. Why didn’t they simply use sp.grad = lp.grad.clone()?
  2. What is the difference between ._grad and .grad?

Thank you in advance for the help.

I looked into this a bit, and apparently ._grad is an attribute of objects of type nn.Parameter. Its name starts with a single underscore _, which per PEP 8 marks it as internal, i.e. NOT meant to be modified from outside, but that’s exactly what’s being done here and in the various other implementations that I looked up and that you also mention.

I also couldn’t find much information on the purpose of having ._grad when .grad is already there to store, retrieve, and possibly modify the gradients, apart from both appearing as attributes in the stub file and their values being identical after a .backward() call.

BTW, I’m not into RL, but since you mention .clone(): using it would make sense if you don’t want changes in the .grad of local_net’s parameters to be reflected in the .grad of shared_net’s parameters, and just want to explicitly copy the gradients from one model to the other.
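
For instance, here’s a minimal standalone sketch of that aliasing effect (not A3C code; the two Parameters are just placeholders):

import torch

lp = torch.nn.Parameter(torch.ones(2))
sp = torch.nn.Parameter(torch.ones(2))

(lp * 3).sum().backward()   # populates lp.grad with tensor([3., 3.])

sp.grad = lp.grad           # no clone: both parameters now share one gradient tensor
lp.grad.zero_()             # an in-place change through the local parameter...
print(sp.grad)              # ...is visible here too: tensor([0., 0.])

With sp.grad = lp.grad.clone() instead, the zero_() above would leave sp.grad untouched at tensor([3., 3.]).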

Couldn’t help much here, sorry!

Edit: I also checked whether .grad and ._grad refer/point to the same object in memory, by printing their addresses with id() and by using the is operator, and they do!
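
In case anyone wants to reproduce the check, this is roughly what I ran (a throwaway Linear layer, purely for illustration):

import torch

layer = torch.nn.Linear(2, 2)
layer(torch.randn(1, 2)).sum().backward()

p = next(layer.parameters())
print(p.grad is p._grad)        # True: both names refer to the same tensor
print(id(p.grad), id(p._grad))  # identical memory addresses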

Thank you so much Srishti, I think you answered all my doubts.

The two are exactly the same and point to the same object. ._grad is only there for backward-compatibility reasons, as some very old code still uses it. But you can replace it with no issue: sp.grad = lp.grad will do exactly the same thing.
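
For completeness, a runnable sketch of the copy loop written with the public attribute (the Linear layers below are just placeholders for the actual A3C networks):

import torch

# Placeholder models standing in for the local and shared A3C nets.
local_net = torch.nn.Linear(4, 2)
shared_net = torch.nn.Linear(4, 2)

local_net(torch.randn(1, 4)).sum().backward()

# Same effect as sp._grad = lp.grad from the original snippet.
for lp, sp in zip(local_net.parameters(), shared_net.parameters()):
    sp.grad = lp.grad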
