Writing my own derivative formula


Is there any way to supply a derivative formula to the Autograd engine? I want to build a network and compute gradients using a custom formula. Any help is greatly appreciated!

You just have to override the backward function in the nn.Module

Note that the backward function is not on the nn.Module but on an autograd.Function!
There are more details here as well: Extending PyTorch — PyTorch 1.8.1 documentation
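To illustrate, here is a minimal sketch of a custom autograd.Function with a hand-written derivative formula (the function `MySquare` and the choice of y = x² are just example placeholders, not anything from the docs):

```python
import torch

class MySquare(torch.autograd.Function):
    """Computes y = x**2 with an explicitly supplied derivative."""

    @staticmethod
    def forward(ctx, x):
        # Save the input so backward can use it in the derivative formula.
        ctx.save_for_backward(x)
        return x ** 2

    @staticmethod
    def backward(ctx, grad_output):
        (x,) = ctx.saved_tensors
        # Custom derivative formula: dy/dx = 2x, chained with grad_output.
        return grad_output * 2 * x

x = torch.tensor([3.0], requires_grad=True)
y = MySquare.apply(x)   # call via .apply, not the constructor
y.backward()
print(x.grad)           # tensor([6.])
```

Autograd then uses your `backward` in place of its own derivative whenever `MySquare.apply` appears in the graph.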

Oops! Have to say I never tried it :slight_smile: thanks for the clarification.