Hi,
Is there any way to supply a derivative formula to the Autograd engine? I want to build a network and compute gradients using a custom formula. Any help is greatly appreciated!
https://pytorch.org/tutorials/beginner/examples_autograd/two_layer_net_custom_function.html
You just have to override the backward function in the nn.Module.
Note that the backward function is not on the nn.Module but on an autograd.Function!
You have more details here as well: Extending PyTorch — PyTorch 1.8.1 documentation
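To make this concrete, here is a minimal sketch of a custom `torch.autograd.Function` that supplies its own derivative formula in `backward`; the class name and the toy function (squaring) are made up for illustration:

```python
import torch

class SquareWithCustomGrad(torch.autograd.Function):
    """Computes y = x**2, with a hand-written gradient formula."""

    @staticmethod
    def forward(ctx, x):
        # Save the input so backward can use it in the derivative formula.
        ctx.save_for_backward(x)
        return x * x

    @staticmethod
    def backward(ctx, grad_output):
        (x,) = ctx.saved_tensors
        # Custom derivative formula: dy/dx = 2 * x (chain rule applied via grad_output).
        return grad_output * 2 * x

x = torch.tensor([3.0], requires_grad=True)
y = SquareWithCustomGrad.apply(x)  # call via .apply, not the constructor
y.backward()
print(x.grad)  # tensor([6.])
```

Note that you invoke the function through `SquareWithCustomGrad.apply(...)`; if you want the nn.Module interface, wrap that call inside a module's `forward`.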
Oops! I have to say I never tried it. Thanks for the clarification!