Are Modules signalled about updates by the Optimizer?

Hi to all!

As far as I understand it, an object of the torch.optim.Optimizer class updates the parameters of a torch.nn.Module object by reference, so the Module is unaware of the updates.
Is there a way to put a monitor on weight.data and bias.data? Is a Module signalled somehow when its parameters are updated?

Thank you for your help!

Hi,

You should never use .data as it is deprecated now.
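
For example, a minimal sketch (toy layer assumed) of the usual replacement: wrap in-place changes in torch.no_grad() instead of going through .data:

```python
import torch
import torch.nn as nn

# Hypothetical toy layer, just to illustrate the pattern.
layer = nn.Linear(4, 2)

# Instead of layer.weight.data.copy_(...), modify the parameter
# inside torch.no_grad() so autograd does not track the change.
with torch.no_grad():
    layer.weight.copy_(torch.zeros_like(layer.weight))
```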

I don’t think there is any mechanism to know when a Tensor is modified, I’m afraid. The Optimizer does not even reference the Module, just the Tensors.
What would be your particular use case here?
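
To illustrate (a small assumed example): the optimizer’s param_groups hold the very same Tensor objects as the Module, so updates happen in place and no callback reaches the Module:

```python
import torch
import torch.nn as nn

# Assumed toy model, only to show that the optimizer stores
# references to the parameter Tensors, not to the Module itself.
model = nn.Linear(4, 2)
opt = torch.optim.SGD(model.parameters(), lr=0.1)

# The optimizer's param_groups contain the very same Tensor objects:
print(opt.param_groups[0]["params"][0] is model.weight)  # True

# After opt.step() the weights are updated in place;
# the Module receives no signal or callback about it.
```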

Hi!
Thanks for your reply!
I’m trying to share updates among variables processing different features.
After some trial and error I found out that it’s just the same as copying the individual variables, since all the updates will be the same.
But now I’m concerned… I’m using .data to copy the same variable!
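
If the goal is for several layers to receive identical updates, one hedged sketch (toy shapes and names assumed) is to reuse the same nn.Parameter object rather than copying through .data, so a single optimizer step updates every layer that holds it:

```python
import torch
import torch.nn as nn

# Assumed toy layers sharing one weight: both point to the same Parameter,
# so there is no copying via .data at all.
layer_a = nn.Linear(8, 8, bias=False)
layer_b = nn.Linear(8, 8, bias=False)
layer_b.weight = layer_a.weight  # tie the parameters

opt = torch.optim.SGD(layer_a.parameters(), lr=0.1)

x = torch.randn(3, 8)
loss = (layer_a(x) + layer_b(x)).sum()
loss.backward()   # gradients from both layers accumulate on the shared weight
opt.step()        # one update, seen by both layers

print(layer_a.weight is layer_b.weight)  # True
```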