Can conv layers share weights?

Hello. I have a network that takes two images as input and convolves each one separately, but with the same conv module, meaning the conv module is shared (same weights) between the two branches. Can the code below do what I want?
def forward(self, im1, im2):
    conv1 = self.conv(im1)
    conv2 = self.conv(im2)
Will this work correctly during backprop?

Yes. Autograd simply adds the gradients from the two branches into the shared weights.


Could I get an example or code snippet illustrating "add the gradients of the two branches"? I don't understand what that means.
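Here is a minimal sketch of what "adding the gradients" means (module names and shapes are just illustrative). It computes the weight gradient from each branch separately, then from both branches together, and shows that the combined gradient equals the sum of the two:

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

# A single conv module, shared by both branches (illustrative shapes).
conv = nn.Conv2d(3, 8, kernel_size=3)
im1 = torch.randn(1, 3, 16, 16)
im2 = torch.randn(1, 3, 16, 16)

# Branch 1 alone: gradient contributed by im1 only.
conv.zero_grad()
conv(im1).sum().backward()
g1 = conv.weight.grad.clone()

# Branch 2 alone: gradient contributed by im2 only.
conv.zero_grad()
conv(im2).sum().backward()
g2 = conv.weight.grad.clone()

# Both branches through the shared module, as in your forward():
# autograd accumulates both contributions into conv.weight.grad.
conv.zero_grad()
loss = conv(im1).sum() + conv(im2).sum()
loss.backward()
g_both = conv.weight.grad.clone()

print(torch.allclose(g_both, g1 + g2))  # True
```

So calling `self.conv` twice in `forward` is fine: during `backward()`, each use of the shared module contributes its own gradient, and autograd sums them into the single set of shared parameters before the optimizer step.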