Can I modify weights and gradients when training the network?

Can I modify weights and gradients when training the network?
If yes, how do I do it before and after updating the parameters?

To modify gradients, use Variable hooks:

http://pytorch.org/docs/autograd.html#torch.autograd.Variable.register_hook
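For example, a minimal sketch (the layer sizes and the clamp range here are made up for illustration):

import torch
import torch.nn as nn

# toy model; names and sizes are only for this example
net = nn.Sequential(nn.Linear(10, 20), nn.ReLU(), nn.Linear(20, 1))

# the hook receives the parameter's gradient during backward();
# whatever it returns replaces that gradient
net[0].weight.register_hook(lambda grad: grad.clamp(min=-1e-3, max=1e-3))

loss = net(torch.randn(4, 10)).sum()
loss.backward()  # the hook fires here
print(net[0].weight.grad.abs().max())  # <= 1e-3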

You can modify network weights directly. For example,

net.conv1.weight.data.clamp_(min=-1e-3, max=1e-3)
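If you want to adjust the weights around each update, a common place to do it is right after optimizer.step() in the training loop. A minimal sketch (the model, learning rate, and clamp range are placeholders):

import torch
import torch.nn as nn
import torch.optim as optim

net = nn.Sequential(nn.Linear(10, 20), nn.ReLU(), nn.Linear(20, 1))  # toy model
optimizer = optim.SGD(net.parameters(), lr=0.1)

for _ in range(10):  # training loop sketch
    optimizer.zero_grad()
    loss = net(torch.randn(4, 10)).pow(2).mean()
    loss.backward()
    optimizer.step()
    # modify the weights right after the parameter update
    for p in net.parameters():
        p.data.clamp_(min=-1e-3, max=1e-3)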

Thank you, that's helpful!

It seems the hook is activated during the backward pass, and it will automatically affect the gradients of the next layer reached in backward, right?
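For instance, in this small sketch (layer sizes made up), zeroing the gradient of an intermediate output also zeroes the gradients of the layer that produced it, while the layer after it is unaffected:

import torch
import torch.nn as nn

fc1, fc2 = nn.Linear(10, 20), nn.Linear(20, 1)

h = fc1(torch.randn(4, 10))
h.register_hook(lambda grad: grad * 0.0)  # modify the gradient of the intermediate output
fc2(h).sum().backward()

print(fc1.weight.grad.abs().max())  # 0.0 -- fc1 is the next layer reached in backward
print(fc2.weight.grad.abs().max())  # nonzero -- fc2's gradient is computed before the hook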

And I found a way to modify the weights and gradients too, but it's not as awesome as yours, haha.

In PyTorch, I can modify weights and gradients directly by assigning a tensor to them, like this:

model.conv1.weight.grad.data = torch.ones(model.conv1.weight.grad.data.size()).cuda()

And this is slightly different from the hook method if you use optim.step(). But if you write your own step() method and modify the gradients inside your step(), this method and the hook method will do the same thing. Am I right?
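What I mean is something like this (a sketch with a toy model and plain SGD), where the gradient is overwritten between backward() and step():

import torch
import torch.nn as nn
import torch.optim as optim

net = nn.Linear(10, 1)                      # toy model
optimizer = optim.SGD(net.parameters(), lr=0.1)

optimizer.zero_grad()
net(torch.randn(4, 10)).sum().backward()

# overwrite the gradient directly, after backward() and before step();
# a hook returning torch.ones_like(grad) would give the same update here
net.weight.grad.data = torch.ones_like(net.weight.grad.data)

optimizer.step()                            # the update uses the modified gradient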

And by the way, the weights are Tensors but the gradients are Variables, which seems strange, haha. I guess there must be a reason for it, like supporting gradients of gradients? Anyway :smile:

Thank you for your help!

PyTorch is awesome!