I have a network, for example input -> conv1 -> bn1 -> relu -> pool1. Now I have the gradient at pool1's output, and I want to compute all the gradients of the network w.r.t. the input. Would this work: torch.autograd.backward(network, grad_tensors=pool1_gradient)?
Thanks for the reply. I wonder: if I use torch.autograd.backward(pool1_output, grad_tensors=pool1_gradient), will the gradients of all layers w.r.t. the input be computed, or only the gradient of pool1?
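A minimal sketch of what is being asked, assuming a toy network shaped like the one described (the layer sizes and input shape here are made up for illustration). Note that `torch.autograd.backward` is called on the *output tensor*, not on the module; autograd then propagates the supplied gradient back through the entire recorded graph, populating `.grad` on every parameter and on the input:

```python
import torch
import torch.nn as nn

# Hypothetical network matching the description: input -> conv1 -> bn1 -> relu -> pool1
net = nn.Sequential(
    nn.Conv2d(3, 8, kernel_size=3, padding=1),  # conv1
    nn.BatchNorm2d(8),                          # bn1
    nn.ReLU(),                                  # relu
    nn.MaxPool2d(2),                            # pool1
)

x = torch.randn(1, 3, 8, 8, requires_grad=True)
pool1_output = net(x)

# Stand-in for an externally supplied gradient of some loss w.r.t. pool1's output
pool1_gradient = torch.ones_like(pool1_output)

# Backward from the intermediate tensor: this fills .grad for ALL upstream
# tensors that require grad -- the conv/bn parameters and the input x --
# not just something local to pool1.
torch.autograd.backward(pool1_output, grad_tensors=pool1_gradient)

print(x.grad.shape)              # gradient w.r.t. the input
print(net[0].weight.grad.shape)  # conv1's weight gradient
```

Equivalently, `pool1_output.backward(pool1_gradient)` does the same thing. Passing the module itself (`network`) as the first argument would not work, since `backward` expects tensors that are part of the autograd graph.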