Hello, I am not very good with pure torch. I am trying to add constraints to an optimization problem (following a suggestion that I found), and so far I have arrived at this point by following the documentation and the tutorials:
import numpy as np
import torch

pts = torch.tensor(points.astype(np.float64))
# the parameters need a floating-point dtype, otherwise requires_grad fails
base_par = torch.tensor([100.0, 100.0, 0.0, 0.0], dtype=torch.float64)
aux_par = torch.tensor([1 / len(points)] * len(points), dtype=torch.float64)
base_par.requires_grad = True
aux_par.requires_grad = True

optimizer = torch.optim.SGD([
    {'params': base_par, 'lr': 1e-3},
    {'params': aux_par, 'lr': 1e-5},
], lr=1e-10)

loss = None
for i in range(1000):
    loss = -base_par[0] * base_par[1]
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    with torch.no_grad():
        base_par[0].clamp_(1, max1 / 2)
        base_par[1].clamp_(1, max2 / 2)
        for par in aux_par:
            par.clamp_(1e-10, 1)
        torch.clamp(aux_par.sum(), 1, 1)  # this is the line I am unsure about
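To double-check my doubt about that last line, I ran a small standalone experiment (the tensor values here are made up, just for illustration): torch.clamp returns a new tensor and leaves its argument untouched, so clamping the sum cannot change aux_par itself.

```python
import torch

t = torch.tensor([0.4, 0.8])          # stands in for aux_par; sums to 1.2
out = torch.clamp(t.sum(), 1, 1)      # returns a NEW scalar tensor equal to 1.0

print(out.item())                      # 1.0
print(t)                               # t itself is unchanged, still sums to 1.2
```

So even the in-place variant clamp_ applied to the sum would only modify the temporary scalar produced by t.sum(), never the individual entries.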
I need to add the constraint that the sum of all aux_par must be equal to 1. All I was able to find was to use clamp, as in the last line, but written this way I don't think it does anything useful (torch.clamp returns a new tensor instead of modifying aux_par in place). Is there a proper way to define such a constraint? What am I missing? Thank you
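One idea I have been experimenting with, though I am not sure it is the right approach, is to project aux_par back onto the constraint set after each optimizer step by renormalizing it in place (here with a small made-up tensor instead of my real aux_par):

```python
import torch

# stand-in for aux_par: four weights that should sum to 1
aux_par = torch.tensor([0.3, 0.3, 0.3, 0.3], dtype=torch.float64,
                       requires_grad=True)

# ... this would go after optimizer.step(), inside the training loop:
with torch.no_grad():
    aux_par.clamp_(1e-10, 1)       # keep each weight in (0, 1]
    aux_par /= aux_par.sum()       # renormalize in place so the sum is 1

print(aux_par.sum().item())        # within floating-point error of 1.0
```

Would that be a valid way to enforce the constraint, or does modifying the parameters in place like this interfere with the optimizer?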