Learnable parameter in Softplus

Hi, I would like to know how to make the scale parameter `beta` in torch.nn.Softplus() learnable.

Please suggest.
Thank you.

I think you can just make a new module (and probably also define the function yourself to get gradients).

class LearnedSoftPlus(torch.nn.Module):
    def __init__(self, init_beta=1.0, threshold=20):
        super().__init__()
        # keep beta > 0
        self.log_beta = torch.nn.Parameter(torch.tensor(float(init_beta)).log())
        self.threshold = threshold
    def forward(self, x):
        beta = self.log_beta.exp()
        beta_x = beta * x
        # fall back to the identity for large inputs, like nn.Softplus does
        return torch.where(beta_x < self.threshold, torch.log1p(beta_x.exp()) / beta, x)

or somesuch (didn’t test it, really).
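If you want to convince yourself that beta is actually learnable, a quick gradient check along these lines should do. This is a self-contained sketch (class repeated so it runs on its own, with the threshold wired through); the same untested caveat applies:

```python
import torch

class LearnedSoftPlus(torch.nn.Module):
    def __init__(self, init_beta=1.0, threshold=20):
        super().__init__()
        # keep beta > 0 by learning its log instead
        self.log_beta = torch.nn.Parameter(torch.tensor(float(init_beta)).log())
        self.threshold = threshold

    def forward(self, x):
        beta = self.log_beta.exp()
        beta_x = beta * x
        # fall back to the identity for large inputs, like nn.Softplus does
        return torch.where(beta_x < self.threshold, torch.log1p(beta_x.exp()) / beta, x)

# quick check: a backward pass should deposit a gradient on log_beta,
# so an optimizer would update beta during training
m = LearnedSoftPlus(init_beta=2.0)
x = torch.randn(8)
m(x).sum().backward()
print(m.log_beta.grad)  # a finite 0-dim tensor, not None
```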

Best regards

Thomas


Thank you. I have checked it and it seems to work correctly (it was just missing the super().__init__() line).


Oh, indeed, thanks for pointing that out. I added that in just in case someone else tries to use the code.

Hi, thanks a lot for your code.
It would be perfect if you used self.threshold instead of the hard-coded 20 in the forward pass (and assigned the threshold argument, rather than 20, to self.threshold in __init__).
