How to force the weights of a Conv layer to be positive?


I tried to use several convolution layers as learnable filters or smoothers to process time series data. However, I found that some of the weights are still negative even after convergence. This is definitely not a good property for a smoother, so I want to force all the weights to be positive.

What I am currently doing is to build a weight tensor x with requires_grad=True manually, and wrap it with a softplus function. Just curious if there is any more efficient way to handle it? Thank you!
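For reference, a minimal sketch of that manual approach (tensor shapes and names are illustrative, not from the original post): keep an unconstrained raw tensor and pass it through softplus before calling the convolution, so the effective filter is always strictly positive while the optimizer updates the raw values freely.

```python
import torch
import torch.nn.functional as F

# Unconstrained raw parameter; the optimizer would update this tensor.
raw_weight = torch.randn(1, 1, 5, requires_grad=True)  # (out_ch, in_ch, kernel)
x = torch.randn(8, 1, 100)                             # batch of 1-D time series

# softplus maps R -> (0, inf), so the effective weight is strictly positive.
positive_weight = F.softplus(raw_weight)
y = F.conv1d(x, positive_weight)
```

Gradients flow through the softplus back to `raw_weight`, so any standard optimizer works unchanged.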


I don’t think so. All the included optimizers work on the real domain, so an unrestricted <-> positive mapping seems unavoidable.

The fancy way PyTorch offers for this is parametrizations, but I think for positivity you still need to roll your own transformation (which is easy with exp or softplus or similar).
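A small sketch of what such a parametrization could look like with `torch.nn.utils.parametrize` (the `Positive` module name is made up for illustration):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F
from torch.nn.utils import parametrize

class Positive(nn.Module):
    """Maps the unconstrained underlying tensor to a strictly positive one."""
    def forward(self, X):
        return F.softplus(X)

conv = nn.Conv1d(1, 1, kernel_size=5)
parametrize.register_parametrization(conv, "weight", Positive())

# conv.weight is now computed as softplus(underlying parameter),
# so it is positive by construction; the optimizer sees only the
# unconstrained parameter.
x = torch.randn(8, 1, 100)
y = conv(x)
```

After registration, `conv.weight` transparently returns the transformed tensor, so the rest of the training loop stays unchanged.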

Best regards



That perfectly solves my problem. Many thanks! :grin: