Hi, I’m working on a project where I need to compute the gradient of the last layer of a network using a custom approach and then backpropagate this custom gradient through the rest of the network. Is there a way to achieve this in PyTorch?
You’ll want to subclass torch.autograd.Function and manually define the forward and backward formulae for your custom function.
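A minimal sketch of the idea: an identity-like `Function` whose `backward` substitutes a custom gradient rule in place of the true one. The class name and the specific rule (taking the sign of the upstream gradient) are illustrative assumptions, not from your project:

```python
import torch

class CustomGrad(torch.autograd.Function):
    """Identity in the forward pass; custom gradient in the backward pass."""

    @staticmethod
    def forward(ctx, input):
        # Save the input in case the custom rule needs it.
        ctx.save_for_backward(input)
        return input

    @staticmethod
    def backward(ctx, grad_output):
        (input,) = ctx.saved_tensors
        # Illustrative custom rule: replace the incoming gradient with its sign.
        return torch.sign(grad_output)

x = torch.tensor([1.0, -2.0, 3.0], requires_grad=True)
y = CustomGrad.apply(x)          # apply() wires the Function into autograd
loss = (y * torch.tensor([2.0, -3.0, 4.0])).sum()
loss.backward()
# Upstream gradient w.r.t. y is [2, -3, 4], so x.grad becomes its sign: [1, -1, 1]
print(x.grad)
```

Everything upstream of the custom `Function` then receives your modified gradient through the usual autograd machinery, so the rest of the network backpropagates normally.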
You can read more in the docs: Automatic differentiation package - torch.autograd — PyTorch 1.12 documentation