I am wondering what I am doing wrong with `register_hook`, since the hook never seems to fire.

My goal is to modify the gradient matrix before the weights are updated. For debugging purposes I just return a zero matrix from the hook, but the network still trains perfectly.
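For context, here is a minimal standalone snippet showing what I understand `register_hook` to do (the tensor returned by the hook replaces the gradient). This does behave as expected for me in isolation:

```python
import torch

# A single parameter tensor with a hook that zeroes its gradient.
w = torch.ones(3, requires_grad=True)
h = w.register_hook(lambda grad: torch.zeros_like(grad))

loss = (w * 2).sum()
loss.backward()  # the hook fires here, during the backward pass

print(w.grad)  # tensor([0., 0., 0.])
h.remove()
```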

So clearly the hooks aren't having any effect, but I cannot figure out why. Here is my train function:

```python
def train_cnn():
    model = Net()
    model.cuda()
    criterion = torch.nn.CrossEntropyLoss()
    model.train()
    optimizer = torch.optim.SGD(model.parameters(), lr=lr,
                                momentum=momentum, weight_decay=weight_decay)
    for e in range(epochs):
        agg_loss = 0
        for data in trainloader:
            x, y = data
            x = x.cuda()
            y = y.cuda()
            outputs = model(x)
            loss = criterion(outputs, y)
            optimizer.zero_grad()
            loss.backward()
            with torch.no_grad():
                hooks = []
                # Register a hook on every conv and linear weight that
                # replaces the gradient with a zero matrix.
                for conv_layer in model.convos:
                    conv_layer.weight.retain_grad()
                    h = conv_layer.weight.register_hook(
                        lambda grad: torch.zeros_like(grad))
                    hooks.append(h)
                for fc_layer in model.linears:
                    fc_layer.weight.retain_grad()
                    h = fc_layer.weight.register_hook(
                        lambda grad: torch.zeros_like(grad))
                    hooks.append(h)
            optimizer.step()
            for h in hooks:
                h.remove()
```

Thank you!