Hello everyone,

I am trying to build a custom convolution layer (conv2d) in which the element-wise multiplication will later be replaced with an approximate multiplication.

(I know it will be very slow, but that's fine; I will deal with that problem later.)

I have managed to rebuild the forward pass and it works fine, but the problem is with the backward pass and the gradient calculation. I have rewritten everything from scratch, even the Conv2DTranspose layer. You can find the code in the link below:

During the backward pass I get the error: **"function convAppxBackward returned a gradient different than None at position 2, but the corresponding forward input was not a Variable"**
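For context, this error typically means that a custom `autograd.Function`'s `backward` returns a non-`None` gradient in a position whose corresponding `forward` input was not a differentiable tensor (e.g. an integer stride or a plain, non-`requires_grad` tensor). A minimal sketch of the usual structure, with hypothetical names (`ConvAppx` stands in for my layer, and exact `conv2d` stands in for the approximate multiplication):

```python
import torch
from torch.autograd import Function

class ConvAppx(Function):
    @staticmethod
    def forward(ctx, input, weight, stride):
        # Save tensors needed by backward; stash non-tensor args on ctx.
        ctx.save_for_backward(input, weight)
        ctx.stride = stride
        # Placeholder for the approximate multiplication: exact conv here.
        return torch.nn.functional.conv2d(input, weight, stride=stride)

    @staticmethod
    def backward(ctx, grad_output):
        input, weight = ctx.saved_tensors
        grad_input = torch.nn.grad.conv2d_input(
            input.shape, weight, grad_output, stride=ctx.stride)
        grad_weight = torch.nn.grad.conv2d_weight(
            input, weight.shape, grad_output, stride=ctx.stride)
        # backward must return one value per forward input, in order.
        # The third slot corresponds to `stride`, which is not a tensor,
        # so it MUST be None -- returning a gradient there raises exactly
        # the "position N ... was not a Variable" error.
        return grad_input, grad_weight, None

# Usage sketch:
x = torch.randn(1, 3, 8, 8, requires_grad=True)
w = torch.randn(4, 3, 3, 3, requires_grad=True)
y = ConvAppx.apply(x, w, 1)
y.sum().backward()
```

The position in the error message counts the `forward` arguments, so it may point at whichever argument (stride, padding, or a constant tensor) is getting a gradient it shouldn't.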

I couldn't figure out where the problem is; I would really appreciate any help!

Thanks!