How to force the weights of a Conv layer to be positive?

Hi,

I tried to use several convolution layers as learnable filters or smoothers to process time series data. However, I found that some of the weights are still negative even after convergence. This is definitely not a good property for a smoother, so I want to force all of the weights to be positive.

What I am currently doing is building a weight tensor manually with requires_grad=True and wrapping it in a softplus function. Just curious if there is a more efficient way to handle it? Thank you!
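Roughly, what I have now looks like the sketch below (shapes and names are just for illustration):

```python
import torch
import torch.nn.functional as F

# Unconstrained raw parameter; the effective filter is softplus(raw_weight).
# Shape (out_channels=1, in_channels=1, kernel_size=5) is just an example.
raw_weight = torch.randn(1, 1, 5, requires_grad=True)

def smooth(x):
    # softplus maps the raw values into (0, inf), so the filter is positive
    return F.conv1d(x, F.softplus(raw_weight), padding=2)

x = torch.randn(8, 1, 100)  # (batch, channels, time)
y = smooth(x)               # gradients flow back to raw_weight
```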

Best,

I don’t think so; all the included optimizers work on the real domain, so an unrestricted <-> positive mapping seems unavoidable.

The fancy way PyTorch offers for this is parametrizations, but I think for positivity you still need to roll your own transformation (which is easy with exp or softplus or similar):
https://pytorch.org/docs/stable/generated/torch.nn.utils.parametrize.register_parametrization.html?highlight=parametrization#torch.nn.utils.parametrize.register_parametrization
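Something along these lines should work (a minimal sketch; the `Positive` module name is mine, not part of the API):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F
from torch.nn.utils import parametrize

class Positive(nn.Module):
    # Maps the unconstrained underlying tensor to strictly positive values.
    def forward(self, X):
        return F.softplus(X)

conv = nn.Conv1d(1, 1, kernel_size=5, bias=False)
parametrize.register_parametrization(conv, "weight", Positive())

# conv.weight is now recomputed as softplus(unconstrained tensor) on each
# access; the optimizer only ever updates the unconstrained tensor.
print((conv.weight > 0).all())  # tensor(True)
```

Note that registering moves the existing weight to `conv.parametrizations.weight.original` and treats it as the unconstrained tensor, so the effective initial weights become softplus of the old initialization.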

Best regards

Thomas


That perfectly solves my problem. Many thanks! :grin: