Weight normalization

Hi all,

I have two weights and want to optimize them separately:

# torch.tensor(torch.FloatTensor([1]), ...) is redundant and raises a warning;
# construct the leaf tensors directly:
Weight1 = torch.tensor([1.0], requires_grad=True)
Weight2 = torch.tensor([1.0], requires_grad=True)

params = [Weight1, Weight2]
opt = torch.optim.Adam(params, lr=LR)

After each update step(), I want to normalize these weights so that sum(Weight1 + Weight2) == 2.
To do that, I am using:

coef = 2/torch.add(Weight1, Weight2)
params = [coef*Weight1, coef*Weight2]

My problem is that after training, the values of Weight1/Weight2 and params differ. For example, the weights are tensor([ 0.7168]) and tensor([ 0.7028]), while params is [tensor([ 1.0099]), tensor([ 0.9901])]. Any idea?
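For what it's worth, `coef*Weight1` creates brand-new tensors, so rebinding `params` to them does not touch the tensors the optimizer is actually updating. A minimal sketch of rescaling the optimizer's own parameters in place after each `step()` (the quadratic loss here is a dummy stand-in for the real objective):

```python
import torch

# Two scalar weights whose sum should stay at 2 (mirrors the post's setup).
w1 = torch.tensor([1.0], requires_grad=True)
w2 = torch.tensor([1.0], requires_grad=True)
opt = torch.optim.Adam([w1, w2], lr=0.1)

for _ in range(100):
    opt.zero_grad()
    # Dummy loss pulling the weights toward different targets
    # (stand-in for whatever the real training objective is).
    loss = (w1 - 1.3) ** 2 + (w2 - 0.6) ** 2
    loss.backward()
    opt.step()
    # Rescale *in place* under no_grad so the optimizer's own tensors
    # stay normalized; `coef * w1` alone would build new tensors that
    # the optimizer never sees.
    with torch.no_grad():
        coef = 2.0 / (w1 + w2)
        w1.mul_(coef)
        w2.mul_(coef)

print(w1 + w2)  # stays (numerically) at tensor([2.])
```

Because the rescaling happens inside `torch.no_grad()` and mutates the leaf tensors directly, `w1` and `w2` themselves always satisfy the constraint, and there is no second `params` list to drift out of sync.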



Just optimize one of them and compute Weight2 = 2 - Weight1.

If I don’t normalize them together, the weights can become negative.
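One way to get positivity and the sum constraint by construction, rather than by post-step fix-ups, is to optimize unconstrained logits and derive the weights through a scaled softmax. A sketch under the same dummy objective as above (the targets 1.3/0.7 are illustrative, not from the post):

```python
import torch

# Unconstrained parameters; the actual weights are derived from them.
logits = torch.zeros(2, requires_grad=True)
opt = torch.optim.Adam([logits], lr=0.1)

for _ in range(200):
    opt.zero_grad()
    # softmax is strictly positive and sums to 1, so 2*softmax always
    # sums to 2 -- no renormalization step needed after the update.
    w = 2 * torch.softmax(logits, dim=0)
    # Dummy objective (stand-in for the real loss).
    loss = (w[0] - 1.3) ** 2 + (w[1] - 0.7) ** 2
    loss.backward()
    opt.step()

w = 2 * torch.softmax(logits, dim=0)
print(w)  # both entries positive, summing to 2
```

Since gradients flow through the softmax into the logits, the optimizer only ever sees unconstrained parameters, and the derived weights can never go negative.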