How to handle Parameters when writing a new layer?

Hello everyone:
I have a 3×128 tensor and I want to learn parameters that weight this tensor. I want to use softmax so that these parameters sum to 1 along dim 0. I wrote a new layer, but it doesn't work, so what's wrong with it? (I am using PyTorch 0.3.1)

class ClassesWeight(nn.Module):
    def __init__(self, model_num, classes_num):
        super(ClassesWeight, self).__init__()
        self.w = nn.Parameter(torch.ones(model_num, classes_num))

    def forward(self, input):
        exp = torch.exp(self.w.data)
        self.w.data = exp / torch.sum(exp, 0)
        output = torch.sum(input * self.w, 1)

        return output

You should not assign values to the parameter's .data. If you do, the parameter cannot be updated properly, because the normalization never becomes part of the computation graph.
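Instead, keep self.w as the raw, unconstrained parameter and apply the softmax inside forward. Here is a minimal sketch of what I think you are after (I kept your sum over dim 1, but you may actually want dim 0 if the goal is a weighted combination of the 3 rows):

import torch
import torch.nn as nn
import torch.nn.functional as F

class ClassesWeight(nn.Module):
    def __init__(self, model_num, classes_num):
        super(ClassesWeight, self).__init__()
        # raw parameters; the softmax in forward keeps the effective
        # weights summing to 1 along dim 0
        self.w = nn.Parameter(torch.ones(model_num, classes_num))

    def forward(self, input):
        weights = F.softmax(self.w, dim=0)  # no .data assignment
        return torch.sum(input * weights, 1)

This way the softmax is part of the graph and self.w receives proper gradients when you call backward.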

What do you want to do in general?

And what does “it doesn’t work” mean? Does an error occur or is there unexpected behavior?