How to use torch.autograd.Function to change the gradient

I have a huge network (e.g. a ResNet) and I would like to change every gradient that is computed during back-propagation.
Let's say I want to add a constant C to each of them.
Can someone help me understand how I can do this via torch.autograd.Function and apply?
I think my function should look something like this:

class Change(torch.autograd.Function):
    @staticmethod
    def forward(ctx, input, C):
        ctx.save_for_backward(input)
        ctx.C = C  # C is a plain constant, so it can be stashed on ctx directly
        return input

    @staticmethod
    def backward(ctx, grad_output):
        input, = ctx.saved_tensors  # trailing comma unpacks the single-element tuple
        grad_input = grad_output.clone()
        # one gradient per forward argument; None for C since it is not a tensor
        return grad_input + ctx.C, None

I am not sure where I should put/apply it, though.
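(A custom Function is invoked through its static apply method. A minimal runnable sketch, assuming C is a plain Python constant rather than a tensor:)

```python
import torch

class Change(torch.autograd.Function):
    @staticmethod
    def forward(ctx, input, C):
        ctx.C = C  # C is a plain constant, so stash it on ctx directly
        return input

    @staticmethod
    def backward(ctx, grad_output):
        # one gradient per forward argument; None for C since it is not a tensor
        return grad_output + ctx.C, None

x = torch.randn(3, requires_grad=True)
y = Change.apply(x, 2.0)  # apply, not an instance call, runs the Function
y.sum().backward()
# x.grad is the unmodified gradient (all ones here) shifted by C
```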

Hi @isalirezag,

Do you want to apply it during back-propagation or after?

In the first case, you should be able to achieve it with a backward hook:

def hook_fn(module, grad_input, grad_output):
    # grad_input may contain None for inputs that do not require grad
    if isinstance(grad_input, tuple):
        return tuple(grad + C if grad is not None else None for grad in grad_input)
    elif isinstance(grad_input, torch.Tensor):
        return grad_input + C

# Add the backward hook on all the layers
for l in rn.modules():
    l.register_full_backward_hook(hook_fn)
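(Put together, a minimal runnable sketch of the hook approach; a single bias-free Linear layer stands in for the ResNet, and C is an assumed constant:)

```python
import torch

C = 0.5  # assumed constant to add to every gradient

def hook_fn(module, grad_input, grad_output):
    # full backward hooks always receive grad_input as a tuple
    return tuple(g + C if g is not None else None for g in grad_input)

lin = torch.nn.Linear(3, 3, bias=False)
lin.register_full_backward_hook(hook_fn)

x = torch.randn(2, 3, requires_grad=True)
lin(x).sum().backward()
# x.grad is the unmodified input gradient shifted by C
```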

In the second case, you can do it with:

for p in model.parameters():
    if p.requires_grad and p.grad is not None:
        p.grad += C
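(A self-contained sketch of the post-backward version, using a tiny Linear model and an assumed constant C:)

```python
import torch

C = 0.5  # assumed constant
model = torch.nn.Linear(4, 2)
model(torch.randn(8, 4)).sum().backward()

# shift every computed gradient after backward() has run
with torch.no_grad():
    for p in model.parameters():
        if p.requires_grad and p.grad is not None:
            p.grad += C
```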