I ran into a problem these days while implementing my idea in PyTorch.
In some layers of my network I get a tensor of shape (Channel, H, W), and the Channel dimension differs between inputs.
I want to multiply a 2D weight matrix with every channel of the tensor, i.e. with each (H, W) sub-tensor: the feature map in every channel should be multiplied by the same weight matrix.
How can I implement this in PyTorch?
Using a for loop, something like this?

    for i in range(channel):
        out[i] = torch.mm(weight, tensor[i, :, :])

(torch.bmm needs 3D inputs, so for a single 2D slice it should be torch.mm.) Will this backpropagate correctly?
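A runnable sketch of the loop version, with made-up shapes: I'm assuming the weight is H x H so that weight @ feature_map keeps each channel's (H, W) shape. Using torch.stack over a list avoids pre-allocating out, and autograd tracks every torch.mm in the loop:

```python
import torch

# Hypothetical shapes: Channel varies per input, feature maps are H x W,
# and the shared weight is assumed H x H so the product stays (H, W).
channel, H, W = 3, 4, 5
tensor = torch.randn(channel, H, W)
weight = torch.randn(H, H, requires_grad=True)

# Per-channel 2D matmul; torch.mm handles the 2D slices (bmm would need 3D).
out = torch.stack([torch.mm(weight, tensor[i, :, :]) for i in range(channel)])
print(out.shape)  # torch.Size([3, 4, 5])

# Gradients flow back through the loop to the shared weight.
out.sum().backward()
print(weight.grad.shape)  # torch.Size([4, 4])
```

The loop works, but it is slower than a single broadcasted matmul (see below in the thread).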
I understand what you mean: the weight matrix can be broadcast.
But in PyTorch code I always see torch.mm(A, B) (or torch.bmm for batches) instead of A @ B, and I'm wondering: if I use A @ B, will my network still backpropagate correctly (with autograd)?
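A small sketch checking both concerns at once (shapes are made up; the weight is assumed H x H): A @ B is just torch.matmul(A, B), matmul broadcasts the 2D weight across the channel dimension, the result matches the explicit per-channel loop, and autograd differentiates through it:

```python
import torch

channel, H, W = 3, 4, 5
x = torch.randn(channel, H, W)
weight = torch.randn(H, H, requires_grad=True)

# A @ B dispatches to torch.matmul, which broadcasts the (H, H) weight
# against every (H, W) channel, like a batched torch.mm.
out = weight @ x
print(out.shape)  # torch.Size([3, 4, 5])

# Same result as the explicit per-channel loop.
loop_out = torch.stack([torch.mm(weight, x[i]) for i in range(channel)])
print(torch.allclose(out, loop_out))  # True

# Autograd handles the broadcasted @ just like any other op.
out.sum().backward()
print(weight.grad.shape)  # torch.Size([4, 4])
```

So using @ is safe: it is the same differentiable matmul op, just with broadcasting.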