How to perform backpropagation with noise

Hi, I wonder if it is possible to backpropagate with noise. For example, I may have a network that looks like this:

import torch
import torch.nn as nn

class TwoLayerNetwork(nn.Module):
    def __init__(self):
        super().__init__()
        self.fc1 = nn.Linear(1, 2)
        self.fc2 = nn.Linear(2, 1)

    def forward(self, x):
        x = torch.relu(self.fc1(x))
        x = self.fc2(x)
        return x

I believe the gradient of fc1 is calculated from that of fc2 during backpropagation. Now I would like to add noise to the gradient of fc2, so that the gradient of fc1 is computed from the noisy gradient of fc2. How can I achieve this?

Thanks a lot!

You could use torch.nn.Module.register_full_backward_hook to access the gradients during the backward pass and return a modified grad_input. If you register the hook on fc2 and add noise to its grad_input (the gradient with respect to fc2's input), that noisy gradient is what gets propagated back to fc1.
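A minimal sketch of this approach, using your two-layer model (the Gaussian noise scale of 0.01 is an arbitrary choice for illustration):

```python
import torch
import torch.nn as nn

class TwoLayerNetwork(nn.Module):
    def __init__(self):
        super().__init__()
        self.fc1 = nn.Linear(1, 2)
        self.fc2 = nn.Linear(2, 1)

    def forward(self, x):
        x = torch.relu(self.fc1(x))
        return self.fc2(x)

model = TwoLayerNetwork()

def add_noise(module, grad_input, grad_output):
    # grad_input is a tuple of gradients w.r.t. the inputs of fc2.
    # Returning a new tuple replaces these gradients in the backward
    # pass, so fc1's gradients are computed from the noisy values.
    return tuple(
        g + 0.01 * torch.randn_like(g) if g is not None else None
        for g in grad_input
    )

model.fc2.register_full_backward_hook(add_noise)

x = torch.randn(4, 1)
loss = model(x).sum()
loss.backward()  # fc1's grads now flow through the noisy gradient
```

Note that fc2's own weight gradients are still computed from the unmodified grad_output; only the gradient flowing backward to earlier layers (here, fc1) is altered.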