How can I do the convolution backward pass manually, without a forward pass, if I have an input tensor, a grad_output and a weight tensor?
I found that conv2d uses ConvNd = torch._C._functions.ConvNd for the forward pass.
Also, I found a ConvBackward function in here, but I don’t know how to use it. Maybe there is something I missed.
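For reference, this is roughly the computation I am after. The sketch below uses the torch.nn.grad helpers (assuming they are available in your PyTorch build) just to show the pieces and shapes involved, not the internal ConvNd/ConvBackward machinery:

```python
import torch
import torch.nn.grad as nn_grad

# Given pieces: an input, a conv weight, and the gradient coming from the next layer.
x = torch.randn(1, 3, 8, 8)           # input tensor
w = torch.randn(16, 3, 3, 3)          # weight of a 3x3 conv, 3 -> 16 channels
grad_out = torch.randn(1, 16, 6, 6)   # grad_output (stride 1, no padding -> 6x6)

# Gradients w.r.t. the input and the weight, computed without running a forward pass here.
grad_input = nn_grad.conv2d_input(x.shape, w, grad_out, stride=1, padding=0)
grad_weight = nn_grad.conv2d_weight(x, w.shape, grad_out, stride=1, padding=0)

print(grad_input.shape, grad_weight.shape)  # (1, 3, 8, 8) and (16, 3, 3, 3)
```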
Why do you want to do backward propagation without forwarding first? The computational graph is built during the forward pass, and without it a backward pass is not possible.
I want to use a custom forward function together with the standard convolution backward function.
Maybe I should just use the standard function whenever I need backpropagation. Thank you.
Hi,
There is indeed a register_forward_hook function available. It is called every time after the forward function, but it is not intended to change the result of the forward computation: modifying the forward result inside the hook does not change the computational graph, which can lead to wrong gradients in the backward phase. I guess this function is meant for logging or similar purposes.
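For example, a forward hook used purely for logging could look like this (a minimal sketch; the layer and shapes are made up, and nothing is returned, so the output is left untouched):

```python
import torch
import torch.nn as nn

conv = nn.Conv2d(3, 16, kernel_size=3)

# A forward hook only observes the result; returning nothing leaves the output as-is.
def log_output(module, input, output):
    print(f"{module.__class__.__name__}: output shape {tuple(output.shape)}")

handle = conv.register_forward_hook(log_output)
y = conv(torch.randn(1, 3, 8, 8))  # the hook fires here and prints the shape
handle.remove()
```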
If you would like to implement a custom forward function, it might be better to subclass the Conv layer and provide a custom forward implementation, as in the sketch below.
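A minimal sketch of what that could look like; ScaledConv2d and the weight scaling are made up purely for illustration. Because the custom forward only uses differentiable operations, the standard convolution backward is used automatically:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class ScaledConv2d(nn.Conv2d):
    def forward(self, x):
        # Hypothetical tweak: scale the weight before running the standard convolution.
        return F.conv2d(x, self.weight * 0.5, self.bias,
                        self.stride, self.padding, self.dilation, self.groups)

layer = ScaledConv2d(3, 16, kernel_size=3)
out = layer(torch.randn(1, 3, 8, 8))
out.sum().backward()  # gradients flow through the custom forward as usual
```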
I don’t think there’s a way to do that at the moment, because the ConvBackward function is not constructible from Python, and we didn’t think of making that possible. It might be supported after the current autograd refactor.
Thank you for your reply.
I now use register_backward_hook to process the grad_input of a Conv layer that has no bias. It raises
TypeError: expected Variable, but hook returned 'NoneType'
when the hook function returns grad_data, grad_weight, None to replace the original grad_input. It seems that I cannot return None, but what should I return if I don’t have a bias?
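For reference, the setup looks roughly like this (a simplified sketch; the real processing is more involved, and the exact layout of the grad_input tuple and its None entries depends on the PyTorch version):

```python
import torch
import torch.nn as nn

conv = nn.Conv2d(3, 16, kernel_size=3, bias=False)

def process_grad_input(module, grad_input, grad_output):
    # grad_input arrives as a tuple whose layout (and any None entries) depends on
    # the PyTorch version; scale the tensors and pass anything else through.
    return tuple(g * 2 if g is not None else g for g in grad_input)

conv.register_backward_hook(process_grad_input)

x = torch.randn(1, 3, 8, 8, requires_grad=True)
conv(x).sum().backward()
print(x.grad.shape, conv.weight.grad.shape)
```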
Hi,
I had a similar problem, but in my case it was a customized backward step while calling the standard convolution in the forward pass. I built a workaround by calling the convolution from torch.nn._functions, as you can see here: Call backward on function inside a backpropagation step
Maybe this could help you? Does anyone here have ideas regarding my problem?