Autograd `Function` for conv layers

I’m trying to write a `Conv2DFunction` in the same spirit as the `LinearFunction(Function)` example here: https://pytorch.org/docs/master/notes/extending.html
But I can’t quite wrap my head around how to compute the gradients in the backward method, because with convolution kernels you can’t express them as the simple matmuls used in the linear example. Any hints or help would be appreciated.
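For reference, here is roughly the skeleton I’m aiming for (argument handling simplified; the backward is the part I can’t figure out):

```python
import torch.nn.functional as F
from torch.autograd import Function

class Conv2DFunction(Function):
    @staticmethod
    def forward(ctx, input, weight, bias=None, stride=1, padding=0, dilation=1):
        ctx.save_for_backward(input, weight, bias)
        ctx.stride, ctx.padding, ctx.dilation = stride, padding, dilation
        # the forward pass itself is easy via the functional API
        return F.conv2d(input, weight, bias, stride, padding, dilation)

    @staticmethod
    def backward(ctx, grad_output):
        input, weight, bias = ctx.saved_tensors
        # how to compute grad_input / grad_weight here without a plain matmul?
        raise NotImplementedError
```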
Thanks!


Have a look at this post, which gives more details on the backward pass in conv layers.

Thanks @ptrblck! In the simple case of one input channel, one output channel, and default values for the other parameters, it is relatively easy. There are a couple of good blog posts covering it, like this one: https://medium.com/@pavisj/convolutions-and-backpropagations-46026a8f5d2c

But when you add padding, dilation, stride, and multiple input/output channels, it makes for a really mind-twisting exercise!
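In case it helps anyone else who lands here: for the general case, one option is to lean on `torch.nn.grad.conv2d_input` and `torch.nn.grad.conv2d_weight`, which compute exactly these gradients for arbitrary stride/padding/dilation/groups. A rough sketch (not tuned for speed), checked against autograd with `gradcheck`:

```python
import torch
import torch.nn.functional as F
from torch.autograd import Function, gradcheck

class Conv2DFunction(Function):
    @staticmethod
    def forward(ctx, input, weight, bias=None, stride=1, padding=0, dilation=1, groups=1):
        ctx.save_for_backward(input, weight, bias)
        ctx.conf = (stride, padding, dilation, groups)
        return F.conv2d(input, weight, bias, stride, padding, dilation, groups)

    @staticmethod
    def backward(ctx, grad_output):
        input, weight, bias = ctx.saved_tensors
        stride, padding, dilation, groups = ctx.conf
        grad_input = grad_weight = grad_bias = None
        if ctx.needs_input_grad[0]:
            # grad w.r.t. input: effectively a transposed conv of grad_output with weight
            grad_input = torch.nn.grad.conv2d_input(
                input.shape, weight, grad_output, stride, padding, dilation, groups)
        if ctx.needs_input_grad[1]:
            # grad w.r.t. weight: effectively a conv of input with grad_output
            grad_weight = torch.nn.grad.conv2d_weight(
                input, weight.shape, grad_output, stride, padding, dilation, groups)
        if bias is not None and ctx.needs_input_grad[2]:
            # grad w.r.t. bias: sum grad_output over batch and spatial dims
            grad_bias = grad_output.sum(dim=(0, 2, 3))
        # one return value per forward argument (None for the non-tensor args)
        return grad_input, grad_weight, grad_bias, None, None, None, None

# sanity check in double precision, with stride=2 and padding=1
x = torch.randn(2, 3, 8, 8, dtype=torch.double, requires_grad=True)
w = torch.randn(4, 3, 3, 3, dtype=torch.double, requires_grad=True)
b = torch.randn(4, dtype=torch.double, requires_grad=True)
print(gradcheck(Conv2DFunction.apply, (x, w, b, 2, 1, 1, 1)))
```

These helpers hide the index gymnastics, so they are also handy as a reference implementation to test a hand-written backward against.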