I have a network, for example input->conv1->bn1->relu->pool1. Now I have the gradient of pool1's output, and I want to compute all the gradients of the network w.r.t. the input. Will this work: torch.autograd.backward(network, grad_tensors=pool1_gradient)?
Hope someone can give answers, thanks!
You can check the doc, but backward takes the output of your forward (not the network), and you can pass your gradient via the grad_tensors argument.
Thanks for the reply. If I use torch.autograd.backward(pool1_output, grad_tensors=pool1_gradient), will the gradients of all layers w.r.t. the input be computed, or just the gradient of pool1?
When you call .backward(), it will populate the .grad field of all the leaf Tensors (such as nn.Parameter) that were used to compute your output.
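A minimal sketch of this, with hypothetical layer sizes (the channel counts and input shape below are made up for illustration): backward is called on the pool1 output Tensor, grad_tensors supplies the known gradient, and afterwards the .grad fields of the conv/bn parameters and of the input are populated.

```python
import torch
import torch.nn as nn

# Hypothetical network matching input->conv1->bn1->relu->pool1.
net = nn.Sequential(
    nn.Conv2d(3, 8, kernel_size=3, padding=1),  # conv1
    nn.BatchNorm2d(8),                          # bn1
    nn.ReLU(),                                  # relu
    nn.MaxPool2d(2),                            # pool1
)

# requires_grad=True so the gradient w.r.t. the input is also recorded.
x = torch.randn(1, 3, 8, 8, requires_grad=True)
pool1_output = net(x)

# Suppose this gradient of the loss w.r.t. pool1's output is given.
pool1_gradient = torch.ones_like(pool1_output)

# Pass the output Tensor (not the network module) to backward.
torch.autograd.backward(pool1_output, grad_tensors=pool1_gradient)

# Gradients are now populated for every leaf Tensor used to
# compute pool1_output, not just for pool1 itself.
print(net[0].weight.grad.shape)  # conv1 weight gradient
print(x.grad.shape)              # gradient w.r.t. the input
```

Equivalently, pool1_output.backward(pool1_gradient) does the same thing; torch.autograd.backward is just the functional form.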