Custom forward and backward functions in conv2d layer

Hello everyone,

I am trying to build a custom convolution layer (conv2d) in which the element-wise multiplication will later be replaced with an approximate multiplication.
(I know it will get very slow, but that is fine; I will deal with that problem later.)
I have managed to rebuild the forward part and it works fine, but the problem is with the backward pass and the calculation of the gradients. I have rewritten everything from scratch, even the Conv2DTranspose layer. You can find the code in the link below:
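To give an idea of the approach, a stripped-down version of the forward (stride 1, no padding, placeholder names; not the exact code from the link) looks roughly like this:

```python
import torch
import torch.nn.functional as F

def conv2d_elementwise(x, weight, bias):
    # x: (n, c_in, h, w), weight: (c_out, c_in, kh, kw); stride 1, no padding
    n, c_in, h, w = x.shape
    c_out, _, kh, kw = weight.shape
    out_h, out_w = h - kh + 1, w - kw + 1

    cols = F.unfold(x, kernel_size=(kh, kw))        # (n, c_in*kh*kw, out_h*out_w)
    w_flat = weight.view(c_out, -1)                 # (c_out, c_in*kh*kw)

    # explicit element-wise multiply (this is the op to be swapped for an
    # approximate multiplier later), then a sum over the patch dimension
    prod = cols.unsqueeze(1) * w_flat[None, :, :, None]
    out = prod.sum(dim=2).view(n, c_out, out_h, out_w)
    return out + bias.view(1, -1, 1, 1)
```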

During the backward pass I get the error: "function convAppxBackward returned a gradient different than None at position 2, but the corresponding forward input was not a Variable".
I couldn't figure out where the problem is; I would really appreciate any help!
Thanks!

Hi,

The problem is that the signature of your forward, forward(ctx, x, height, width, windows, out_channels, weight, bias, n_channels), does not match the order in which you return the gradients in the backward: return grad_input, grad_weight, grad_bias, None, None, None, None, None. backward must return one value per forward input (excluding ctx), in the same order as the forward arguments, so I guess it should be return grad_input, None, None, None, None, grad_weight, grad_bias, None.
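To make the correspondence concrete, here is a minimal sketch of the pattern, with a plain F.conv2d and the torch.nn.grad helpers standing in for your approximate versions; the argument names only mirror the signature above:

```python
import torch
import torch.nn.functional as F
from torch.nn.grad import conv2d_input, conv2d_weight

class ConvAppx(torch.autograd.Function):
    @staticmethod
    def forward(ctx, x, height, width, windows, out_channels, weight, bias, n_channels):
        ctx.save_for_backward(x, weight, bias)
        # stand-in for the approximate convolution
        return F.conv2d(x, weight, bias)

    @staticmethod
    def backward(ctx, grad_output):
        x, weight, bias = ctx.saved_tensors
        grad_input = conv2d_input(x.shape, weight, grad_output)
        grad_weight = conv2d_weight(x, weight.shape, grad_output)
        grad_bias = grad_output.sum(dim=(0, 2, 3))
        # one return value per forward input (excluding ctx), in the same order:
        # x, height, width, windows, out_channels, weight, bias, n_channels
        return grad_input, None, None, None, None, grad_weight, grad_bias, None

# quick numerical check of the backward
x = torch.randn(2, 3, 8, 8, dtype=torch.double, requires_grad=True)
w = torch.randn(4, 3, 3, 3, dtype=torch.double, requires_grad=True)
b = torch.randn(4, dtype=torch.double, requires_grad=True)
torch.autograd.gradcheck(lambda x, w, b: ConvAppx.apply(x, 8, 8, None, 4, w, b, 3), (x, w, b))
```

Once the return order lines up with the forward arguments, the "position 2" error should go away, since the second returned gradient now corresponds to the non-tensor height argument and is None.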
