Hello everyone:

I have a 3×128 tensor, and I want to learn parameters that weight it. I want to use softmax so that these parameters sum to 1 along dim 0. I wrote a new layer, but it doesn't work. What's wrong with it? (I use PyTorch 0.3.1)

```python
class ClassesWeight(nn.Module):
    def __init__(self, model_num, classes_num):
        super(ClassesWeight, self).__init__()
        self.w = nn.Parameter(torch.ones(model_num, classes_num))

    def forward(self, input):
        exp = torch.exp(self.w.data)
        self.w.data = exp / torch.sum(exp, 0)
        output = torch.sum(input * self.w, 1)
        return output
```
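For comparison, here is a sketch of one possible fix, assuming the intent is to normalize the raw parameter with `F.softmax` inside `forward` instead of overwriting `self.w.data` (overwriting `.data` bypasses autograd, so no gradient ever reaches `w`):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class ClassesWeight(nn.Module):
    def __init__(self, model_num, classes_num):
        super(ClassesWeight, self).__init__()
        # Store the raw, unconstrained parameter; the softmax
        # constraint is applied on the fly in forward.
        self.w = nn.Parameter(torch.ones(model_num, classes_num))

    def forward(self, input):
        # Normalize along dim 0 so the weights sum to 1 per column,
        # without mutating self.w.data (which would break autograd).
        weights = F.softmax(self.w, dim=0)
        # Same weighted reduction as the original code.
        output = torch.sum(input * weights, 1)
        return output
```

Since `weights` is computed from `self.w` inside the graph, `backward()` now produces a gradient for `w`, which the original `.data` assignment prevented.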