Some confusion about Function and Module

Hello,

I am new to PyTorch. After reading the docs, I still don't understand the difference between Function and Module: the Function class defines forward and backward, while the Module class defines __init__ and forward. So if I want to define a custom layer in a network that needs a custom backward pass (the gradient I get from autograd is not what I want), how do I do it? Do I override Function.backward?

Thanks for your help!

You should subclass autograd.Function and implement both forward and backward yourself; see the docs.
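For example, here is a minimal sketch of a custom Function with a hand-written backward (the clamp rule is just an illustration, not your layer):

import torch

class MyFunction(torch.autograd.Function):
    @staticmethod
    def forward(ctx, input):
        # Save what backward will need.
        ctx.save_for_backward(input)
        return input.clamp(min=0)

    @staticmethod
    def backward(ctx, grad_output):
        (input,) = ctx.saved_tensors
        grad_input = grad_output.clone()
        # Replace autograd's gradient with whatever rule you need here.
        grad_input[input < 0] = 0
        return grad_input

x = torch.randn(4, requires_grad=True)
y = MyFunction.apply(x).sum()
y.backward()
print(x.grad)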
By the way, register_hook may be a better choice if it works for your case. Either way, you should verify that your backward gives the correct gradient (torch.autograd.gradcheck can help with that).
Examples of register_hook, and more at:
https://discuss.pytorch.org/search?q=register%20hook
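For instance, a small register_hook sketch (the doubling hook is just for illustration):

import torch

x = torch.randn(3, requires_grad=True)
# The value returned by the hook replaces the gradient of x.
h = x.register_hook(lambda grad: grad * 2)
y = (x ** 2).sum()
y.backward()
print(x.grad)  # 4 * x instead of the usual 2 * x
h.remove()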

Thanks, chenyuntc.

Actually, I want to design a new layer. For some reasons, I update the weights W as follows:

Should I define a new autograd.Function?

Maybe writing a custom torch.optim.Optimizer is the way to go.
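For example, a minimal sketch of a custom optimizer, assuming your formula 2 can be written as an in-place update of each parameter from its gradient (the SGD-style line below is only a placeholder for your real rule):

import torch
from torch.optim import Optimizer

class CustomOptimizer(Optimizer):
    def __init__(self, params, lr=1e-4):
        super().__init__(params, dict(lr=lr))

    @torch.no_grad()
    def step(self, closure=None):
        loss = closure() if closure is not None else None
        for group in self.param_groups:
            for p in group['params']:
                if p.grad is None:
                    continue
                # Substitute your formula 2 for this placeholder update.
                p.add_(p.grad, alpha=-group['lr'])
        return loss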

But in my model only a few layers need to update their weights following formula 2; the other layers update following formula 1. If I replace the optimizer globally, all layers will update following formula 2, which is not what I want. So I think designing a custom layer is a possible way to do it. I tried to use autograd.Function, but failed.

You can specify that some layers use optimizer1 and others use optimizer2:

optimizer1 = optim.Adam(list(lay1.parameters()) + list(lay2.parameters()), lr=0.0001)
optimizer2 = CustomOptimizer(list(lay3.parameters()) + list(lay4.parameters()), lr=0.0001)
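Then in the training loop you step both of them (model, inputs, and criterion below are placeholders for your own):

optimizer1.zero_grad()
optimizer2.zero_grad()
loss = criterion(model(inputs), targets)  # your real forward and loss
loss.backward()
optimizer1.step()
optimizer2.step()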

Thanks! I will try it!