I’m new to PyTorch. I want to backpropagate gradients not from the last layer but from an arbitrary layer of the network. I have the gradient of the output with respect to the last layer’s weights (the last layer is a fully connected layer without a bias term), and I want to use this gradient to backpropagate and get the gradients of the output with respect to the other layers’ parameters. Is this possible? Thanks!
You can backpropagate the gradient from any Variable that you have. If you want to backpropagate starting from an intermediate layer, your nn.Module needs to return the intermediate Variables corresponding to that layer, so that you can call .backward() on them.
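Here is a minimal sketch of that idea, using a hypothetical two-layer network (the layer sizes and the all-ones gradient are placeholders; in your case you would pass in the gradient you already computed):

```python
import torch
import torch.nn as nn

class TwoLayer(nn.Module):
    """Hypothetical net that also returns its intermediate activation,
    so backpropagation can be started from that point."""
    def __init__(self):
        super().__init__()
        self.fc1 = nn.Linear(4, 3)
        self.fc2 = nn.Linear(3, 2, bias=False)  # last layer, no bias

    def forward(self, x):
        h = self.fc1(x)
        out = self.fc2(h)
        return out, h  # expose the intermediate value

net = TwoLayer()
x = torch.randn(1, 4)
out, h = net(x)

# Suppose you already have d(output)/d(h), the gradient arriving at
# the intermediate layer (a placeholder of ones here).
grad_h = torch.ones_like(h)

# Backpropagate starting from the intermediate value, not from `out`.
h.backward(grad_h)

print(net.fc1.weight.grad)  # populated: layers below h get gradients
print(net.fc2.weight.grad)  # None: backward never touched fc2
```

Note that only the parameters *below* the intermediate point receive gradients; fc2 is skipped entirely because the backward pass starts at h.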