I am new to PyTorch. After reading the PyTorch docs, I still haven't understood the difference between `Function` and `Module`. The `Function` class defines `forward` and `backward`, while the `Module` class defines `__init__` and `forward`. So if I want to define a custom layer in my network that needs a custom backward pass (the gradient I get from autograd is not what I want), how do I do it? By overriding `Function.backward`?
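From what I understand, the pattern is something like the sketch below: subclass `torch.autograd.Function` to override the backward, then wrap it in a `Module` that holds the parameters. The layer shape and the 0.5 gradient scaling here are just made-up placeholders, not the actual formula I need:

```python
import torch
import torch.nn as nn


class MyFunction(torch.autograd.Function):
    @staticmethod
    def forward(ctx, input, weight):
        # Save tensors needed for the custom backward pass.
        ctx.save_for_backward(input, weight)
        return input.mm(weight.t())

    @staticmethod
    def backward(ctx, grad_output):
        input, weight = ctx.saved_tensors
        # Custom gradients instead of the autograd defaults;
        # the 0.5 scaling is only a placeholder example.
        grad_input = grad_output.mm(weight) * 0.5
        grad_weight = grad_output.t().mm(input) * 0.5
        return grad_input, grad_weight


class MyLayer(nn.Module):
    def __init__(self, in_features, out_features):
        super().__init__()
        self.weight = nn.Parameter(torch.randn(out_features, in_features))

    def forward(self, x):
        # Call the custom Function via .apply, not directly.
        return MyFunction.apply(x, self.weight)
```

Is this the right approach, or am I misunderstanding how `Function` and `Module` fit together?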
But in my model only a few layers need to update their weights following formula 2, while the other layers should update following formula 1. If I change `optim.Optimizer`, I think all layers would update following formula 2, which is not what I want. So I think designing a custom layer might be a way to do it. I tried to use `autograd.Function`, but failed.
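To make my concern concrete, here is a rough sketch of the direction I was also considering: splitting the parameters and using two optimizers, one per update rule. Plain SGD stands in for both rules here, since I don't know how to express formula 1 / formula 2 as optimizers; the toy model and the choice of which layer is "special" are just for illustration:

```python
import torch
import torch.nn as nn

# Toy model: suppose only the second layer should follow formula 2.
model = nn.Sequential(nn.Linear(10, 10), nn.Linear(10, 2))

formula1_params = list(model[0].parameters())
formula2_params = list(model[1].parameters())

# SGD is only a stand-in for the two different update rules.
opt_formula1 = torch.optim.SGD(formula1_params, lr=0.1)
opt_formula2 = torch.optim.SGD(formula2_params, lr=0.1)

x = torch.randn(4, 10)
loss = model(x).sum()
loss.backward()
opt_formula1.step()
opt_formula2.step()
opt_formula1.zero_grad()
opt_formula2.zero_grad()
```

But this only lets me pick different built-in optimizers per layer; it doesn't give me the custom gradient itself, which is why I was trying `autograd.Function`. Any pointers on where I went wrong would be appreciated.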