Overwriting convolution weight

I’d like to compute a 2D convolution with kernel weights that I calculate somewhere else. That is, I want to treat the kernel as an input to the Conv2d module, similar to the hidden state in a recurrent network. Can I simply overwrite the ‘weight’ attribute of a torch.nn.Conv2d object with a Variable and then compute the convolution, or will that make the gradients incorrect?
My concern is that the gradients might not be propagated back to the inputs correctly.

You can use torch.nn.functional.conv2d :). The Conv2d module is just a wrapper around the functional interface, so you can pass your externally computed kernel directly as the weight argument and gradients will flow back through it, and through the input, like any other tensor in the graph.
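For reference, a minimal sketch of the functional approach. The shapes and the stand-in kernel computation are made up for illustration, and a tensor with requires_grad=True plays the role of a Variable in modern PyTorch:

```python
import torch
import torch.nn.functional as F

x = torch.randn(1, 1, 8, 8, requires_grad=True)          # input: (N, C_in, H, W)
kernel_source = torch.randn(1, 1, 3, 3, requires_grad=True)
weight = kernel_source * 2.0                              # stand-in for your kernel computation

# weight shape is (C_out, C_in, kH, kW), same layout as Conv2d's weight attribute
out = F.conv2d(x, weight, padding=1)
out.sum().backward()

# Gradients reach both the input and whatever produced the kernel.
print(x.grad.shape)              # torch.Size([1, 1, 8, 8])
print(kernel_source.grad.shape)  # torch.Size([1, 1, 3, 3])
```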

Thanks! I’ll do that.