Normalising parameters of a model

Hey guys,

I have probabilities as parameters in my model and I wish to normalise them (i.e. divide them by their sum so they add up to 1) as they are being optimised. I suppose I should impose this constraint during training with:

with torch.no_grad():

However, I haven't found a way to do so, as assignments like

model.probs = model.probs/model.probs.sum()

don't work, because you cannot assign a plain tensor (the result of the division) to a parameter attribute. Any suggestions?
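
To make the question concrete, here is a minimal reproduction; MyModel and its size are just toy stand-ins for my actual model:

```python
import torch
import torch.nn as nn

# toy stand-in for the real model, just to show the problem
class MyModel(nn.Module):
    def __init__(self, n=5):
        super().__init__()
        # the probabilities I want to keep normalised during optimisation
        self.probs = nn.Parameter(torch.rand(n))

    def forward(self, x):
        return x @ self.probs

model = MyModel()

with torch.no_grad():
    # raises a TypeError, because the right-hand side is a plain tensor,
    # not an nn.Parameter, so it cannot replace the registered parameter
    model.probs = model.probs / model.probs.sum()
```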

Thank you in advance,
Nikos

I think you could try an approach similar to the one used in torch.nn.utils.weight_norm, which uses hooks to (re)normalize the parameters.
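
Here is a rough sketch of that idea using a forward pre-hook that renormalises the parameter in place before every forward pass; the model and the parameter name probs are just placeholders taken from your snippet:

```python
import torch
import torch.nn as nn

class MyModel(nn.Module):
    def __init__(self, n=5):
        super().__init__()
        self.probs = nn.Parameter(torch.rand(n))

    def forward(self, x):
        return x @ self.probs

def renormalise_probs(module, inputs):
    # forward pre-hook: runs before each forward pass and rescales the
    # parameter in place so its entries sum to 1; wrapped in no_grad so
    # the rescaling itself is not tracked by autograd
    with torch.no_grad():
        module.probs.div_(module.probs.sum())

model = MyModel()
model.register_forward_pre_hook(renormalise_probs)

out = model(torch.randn(2, 5))
print(model.probs.sum())  # ~1.0 after the hook has run
```

Alternatively, if you don't want hooks at all, you can call the same in-place op under torch.no_grad() in your training loop right after optimizer.step(), e.g. model.probs.div_(model.probs.sum()). In-place operations avoid the assignment error because they modify the existing parameter instead of trying to replace it with a plain tensor.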