How to clamp a group of parameters

Hello, I am not very good with pure PyTorch. I am trying to add constraints to an optimization problem (following a suggestion that I found). So far, following the documentation and the tutorials, I have arrived at this point:

pts = torch.tensor(points.astype(np.float64))
base_par = torch.tensor([100.0, 100.0, 0.0, 0.0])  # float dtype, so requires_grad can be set
aux_par = torch.tensor([1 / len(points)] * len(points))
base_par.requires_grad = True
aux_par.requires_grad = True
optimizer = torch.optim.SGD([
    {'params': base_par, 'lr': 1e-3},
    {'params': aux_par, 'lr': 1e-5},
], lr=1e-10)
loss = None
for i in range(1000):
    loss = -base_par[0] * base_par[1]
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    with torch.no_grad():
        base_par[0].clamp_(1, max1 / 2)
        base_par[1].clamp_(1, max2 / 2)
        for par in aux_par:
            par.clamp_(1e-10, 1)
        torch.clamp(aux_par.sum(), 1, 1)

I need to add the constraint that the sum of all aux_par must be equal to 1. All I was able to find was to use clamp, as in the last line, but I think that written this way it doesn't do anything useful. Is there a way to define such a constraint? What am I missing? Thank you

I am not sure about your exact use case. Also, aux_par isn't used in the loss in the code snippet of your question, so it receives no gradient there. (Note that torch.clamp on the last line returns a new tensor, which your code discards, so that call has no effect.)

To constrain the sum of aux_par to be 1, I can think of two ways for now:

  1. Add an explicit penalty term to the loss, e.g. (aux_par.sum() - 1)**2. This pushes the sum toward 1 but only enforces the constraint approximately.

  2. If aux_par is used as part of the model, try interpreting the aux_par values as logits and apply softmax() before using them in any operation. This way, it's an implicit assumption that aux_par captures logits, and softmax() gives a probability distribution (which sums to 1 by construction) from those logits.
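
Both options can be sketched in a few lines. Here aux_par is a small stand-in tensor and base_loss is a dummy placeholder for your real objective, just to show where each piece would go:

```python
import torch

# Toy stand-in for the real aux_par (3 auxiliary parameters).
aux_par = torch.zeros(3, requires_grad=True)

# Option 1: soft constraint via a penalty term added to the loss.
# Gradient descent is pushed toward aux_par.sum() == 1, but the
# equality is only approximate.
base_loss = torch.tensor(0.0)  # placeholder for the real objective
loss = base_loss + (aux_par.sum() - 1) ** 2

# Option 2: hard constraint via reparameterization. Treat aux_par as
# logits and use the softmax output wherever the code needs the actual
# weights; the weights then sum to 1 exactly, by construction, and
# gradients still flow back to aux_par.
weights = torch.softmax(aux_par, dim=0)
```

With option 2 no clamping of aux_par is needed at all, since softmax already maps any real values into a valid probability distribution.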