How to constrain a torch variable

Hello guys,

I would like to know if there is a way to constrain a torch tensor that is a neural network parameter so that it only takes on positive values. Is there any way other than torch.clamp?
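
For reference, this is roughly what I mean by the torch.clamp approach (just a minimal sketch, clamping the parameter back in place after each optimizer step):

```python
import torch

# Sketch of the clamp approach: after each optimizer step, clamp the
# parameter back into the positive range in place.
w = torch.nn.Parameter(torch.randn(5))
optimizer = torch.optim.SGD([w], lr=0.1)

loss = (w - 2.0).pow(2).sum()
loss.backward()
optimizer.step()

with torch.no_grad():
    w.clamp_(min=1e-6)  # force the parameter to stay positive in place
```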

Apply a relu activation
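
Something along these lines (a minimal sketch, assuming you keep an unconstrained raw parameter and apply relu to it whenever the constrained value is needed; `PositiveLinear` and `raw_weight` are just illustrative names):

```python
import torch
import torch.nn.functional as F

class PositiveLinear(torch.nn.Module):
    # Illustrative module: the stored parameter is unconstrained, and relu is
    # applied in forward() so the effective weight is always non-negative.
    def __init__(self, in_features, out_features):
        super().__init__()
        self.raw_weight = torch.nn.Parameter(torch.randn(out_features, in_features))

    def forward(self, x):
        return F.linear(x, F.relu(self.raw_weight))
```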

Thank you for the reply. However, by applying a relu activation, the values that are negative will be mapped to zero, but I want them to be positive.

Then you should apply an absolute value function
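
For example (again just a sketch, reparameterizing with an unconstrained raw tensor and taking its absolute value wherever the positive parameter is used):

```python
import torch

raw = torch.nn.Parameter(torch.randn(5))
optimizer = torch.optim.SGD([raw], lr=0.1)

positive = torch.abs(raw)              # negative entries become positive instead of zero
loss = (positive - 3.0).pow(2).sum()   # use the positive view of the parameter in the loss
loss.backward()                        # gradients flow back through abs to raw
optimizer.step()
```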