Modifying parameters after backpropagation

Hi everyone!

I want to know if it’s possible to modify the parameters of an nn.Module after backpropagation has been done.

I want to clamp the parameters of a module so that they are increasing, i.e. C[0] < C[1] < … < C[n-1], but only after they have been updated via gradient descent.

Thanks!
PS: Not sure what the category should be, so I left it under “Uncategorized”.

Yes. Similar to how optimizer.step() manipulates the parameters, you could apply the same logic by manipulating the parameters in place inside a no_grad() context.
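Something like this, for illustration (the model, optimizer, and clamp range here are placeholders, not specific to your use case):

import torch
from torch import nn

model = nn.Linear(4, 2)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

loss = model(torch.randn(8, 4)).sum()
loss.backward()
optimizer.step()

# Modify the parameters in place after the update, without
# recording the operation in the autograd graph.
with torch.no_grad():
    model.weight.clamp_(0, 1)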

Thanks ptrblck for your reply!

I should have been clearer: I want to do it automatically whenever the backward call is done. I’m developing an nn.Module that can be imported and used by simply adding it to another module. I don’t want end users to have to modify their code and manually clamp the values of my module after optimizer.step() is called.

I looked into backward hooks, but they modify the gradients before the parameters are updated. I want something like a “backward_post” hook that executes after the parameters get updated.
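For example, a plain tensor hook fires as soon as the gradient is computed, which is before the optimizer has applied the update (a minimal illustration of the timing):

import torch
from torch import nn

p = nn.Parameter(torch.randn(3))

# Fires when the gradient for p is computed, i.e. before any
# optimizer.step() has touched the parameter itself.
p.register_hook(lambda grad: print("gradient computed; parameter not yet updated"))

p.sum().backward()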

Maybe using a parametrization could be useful in your case. Although a parametrization modifies the parameter before it is used rather than after the update, I think it would achieve the same result. Here is an example of how you could do that:

from torch import nn
import torch.nn.utils.parametrize as parametrize

class Clamp(nn.Module):
    """Reparametrize a tensor to be bounded in [0, 1] with a clamp."""
    def forward(self, X):
        # Applied every time the parametrized tensor is accessed.
        return X.clamp(0, 1)

    def right_inverse(self, Z):
        # Used when a value is assigned to the parametrized tensor.
        return Z

linear = nn.Linear(1, 2)
parametrize.register_parametrization(linear, 'weight', Clamp())
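To see what it does at runtime, a quick check (the input shape is arbitrary; parametrizations.weight.original is the unconstrained tensor the optimizer actually updates):

import torch

x = torch.randn(3, 1)
out = linear(x)

print(linear.weight)                            # the clamped view used in forward
print(linear.parametrizations.weight.original)  # the raw, unconstrained tensor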

Thanks grudloff! Parametrization solves my problem.

I was thinking about using forward pre-hooks, but, if I am not mistaken, those are meant for modifying the inputs to the module, not the module itself. Parametrization is basically the same idea, but for parameters.

I also didn’t know about parametrization and read the tutorial on it after seeing your reply, so thanks for introducing me to that as well.
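For anyone landing here with the same goal (parameters ordered as C[0] < C[1] < … < C[n-1]), one possible way to adapt the Clamp example is a parametrization that builds the tensor from cumulative sums of positive increments. This is a sketch of my own, not from the thread; the Increasing class and the softplus choice are assumptions:

import torch
from torch import nn
import torch.nn.utils.parametrize as parametrize

class Increasing(nn.Module):
    """Reparametrize a 1-D tensor so its entries are strictly increasing."""
    def forward(self, X):
        # Keep the first entry free and add strictly positive increments.
        increments = nn.functional.softplus(X[1:])
        return torch.cat([X[:1], X[0] + torch.cumsum(increments, dim=0)])

    def right_inverse(self, Z):
        # Recover the raw increments from consecutive differences so that
        # assigning a strictly increasing tensor round-trips.
        diffs = Z[1:] - Z[:-1]               # must be > 0
        raw = torch.log(torch.expm1(diffs))  # inverse of softplus
        return torch.cat([Z[:1], raw])

module = nn.Module()
# Initialize with a strictly increasing value so right_inverse is
# well defined when the parametrization is registered.
module.C = nn.Parameter(torch.linspace(0.0, 1.0, 5))
parametrize.register_parametrization(module, 'C', Increasing())
print(module.C)  # entries stay strictly increasing throughout training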
