Is beta a trainable parameter in the nn.Softplus layer?

In the source code, beta is listed as a constant:

__constants__ = ['beta', 'threshold']
beta: int
threshold: int

def __init__(self, beta: int = 1, threshold: int = 20) -> None:
    super().__init__()
    self.beta = beta
    self.threshold = threshold

Hello Tony,
Beta is indeed a constant and is not trainable in the standard definition: it is stored as a plain Python attribute rather than an nn.Parameter, so the optimizer never sees it.

I think that this post answers your question: you can define a custom module in which beta is a learnable parameter. A minimal sketch is below.
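
As an illustration, here is one way such a module could look, assuming the usual Softplus formula (1/beta) * log(1 + exp(beta * x)). The class name LearnableSoftplus is my own; F.softplus is reused so the large-input regime stays numerically stable:

import torch
import torch.nn as nn
import torch.nn.functional as F

class LearnableSoftplus(nn.Module):
    """Softplus with a trainable beta: (1/beta) * log(1 + exp(beta * x))."""

    def __init__(self, beta: float = 1.0) -> None:
        super().__init__()
        # Registering beta as an nn.Parameter (instead of a plain
        # attribute, as in nn.Softplus) is what makes it trainable.
        self.beta = nn.Parameter(torch.tensor(float(beta)))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # F.softplus is numerically stable for large inputs, so we can
        # reuse it and just rescale by beta on the outside.
        return F.softplus(self.beta * x) / self.beta

A quick check that beta actually receives a gradient:

m = LearnableSoftplus()
m(torch.randn(4)).sum().backward()
print(m.beta.grad)  # non-None tensor: beta is trainable

Note that in practice you may want to keep beta positive during training, for example by parameterizing it as the exponential of an unconstrained raw parameter.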