I am wondering how I can use `register_hook`

to modify the filter gradients of a convolutional layer.

Say my conv layer has 10 filters, 3 channels, 5x5 kernels --> shape (10, 3, 5, 5).

Each 5x5 channel slice of a filter should have its own gradient, e.g. G.

I have a corresponding tensor of size (10, 3, 5, 5), where each 5x5 matrix M is a matrix that I want to multiply with the corresponding channel's gradient, i.e. MxG, and use the result as the gradient. Note that M is different for each filter.

For a fully connected layer I simply did

`model.layer.weight.register_hook(lambda grad: torch.t(torch.mm(M, torch.t(grad))))`
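For context, the fully connected version above can be sketched in a self-contained way. The layer sizes and the contents of `M` here are hypothetical, chosen only to make the shapes work out: `grad` has shape `(out_features, in_features)`, so the `M` in this hook must be `(in_features, in_features)`.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)
layer = nn.Linear(4, 3)  # hypothetical sizes: in_features=4, out_features=3

# M is square with side in_features; random values for illustration only.
M = torch.randn(4, 4)

# Same hook as above: (M @ grad.T).T, which equals grad @ M.T.
layer.weight.register_hook(lambda grad: torch.t(torch.mm(M, torch.t(grad))))

x = torch.randn(2, 4)
layer(x).sum().backward()
print(layer.weight.grad.shape)  # torch.Size([3, 4])
```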

How do I do it for convolutional filters?

Will `lambda grad` iterate over the (3, 5, 5) tensors? How do I update my gradient in this case?
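One way to sketch this, assuming M is stored as a (10, 3, 5, 5) tensor as described above: the hook receives the whole (10, 3, 5, 5) gradient in a single tensor, so no iteration is needed. `torch.matmul` treats the leading dimensions as batch dimensions and performs a 5x5 matrix multiply per (filter, channel) pair.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)
conv = nn.Conv2d(in_channels=3, out_channels=10, kernel_size=5)

# One 5x5 matrix per (filter, channel) pair, matching the weight shape
# (10, 3, 5, 5); hypothetical random values for illustration.
M = torch.randn(10, 3, 5, 5)

# The hook gets the full (10, 3, 5, 5) gradient at once. Batched matmul
# computes new_grad[f, c] = M[f, c] @ grad[f, c] for every filter f and
# channel c in one call.
conv.weight.register_hook(lambda grad: torch.matmul(M, grad))

x = torch.randn(1, 3, 32, 32)
conv(x).sum().backward()
print(conv.weight.grad.shape)  # torch.Size([10, 3, 5, 5])
```

If M only varies per filter (the same 5x5 matrix for all 3 channels of a filter), it can instead be stored as (10, 5, 5) and broadcast with `torch.matmul(M.unsqueeze(1), grad)`.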