How to train my custom parameter?

You could write your own nn.Module that includes a trainable alpha parameter:

import torch
import torch.nn as nn

class MyLayer(nn.Module):
    def __init__(self, *args, **kwargs):
        # super().__init__() must run before any nn.Parameter is assigned,
        # otherwise PyTorch raises an AttributeError
        super().__init__(*args, **kwargs)
        self.alpha = nn.Parameter(torch.tensor(0.))
        # Other necessary setup

    def forward(self, x):
        # Necessary forward computations, e.g. scaling the input:
        return self.alpha * x

Because alpha is assigned as an nn.Parameter, it is registered automatically: it shows up in model.parameters(), so using this MyLayer module as part of a network architecture will let your optimizer learn alpha along with all of the other usual parameters. Hope this helps!
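As a quick sanity check, here is a minimal end-to-end sketch (the toy data and target value of 3.0 are made up purely for illustration) showing that a plain optimizer does update alpha:

```python
import torch
import torch.nn as nn

class MyLayer(nn.Module):
    def __init__(self):
        super().__init__()  # must run before registering parameters
        self.alpha = nn.Parameter(torch.tensor(0.))

    def forward(self, x):
        return self.alpha * x

layer = MyLayer()
# alpha appears in named_parameters(), so any optimizer will pick it up
print(dict(layer.named_parameters()))

# Toy regression: recover alpha ≈ 3 from synthetic data y = 3 * x
opt = torch.optim.SGD(layer.parameters(), lr=0.1)
x = torch.randn(64)
y = 3.0 * x
for _ in range(200):
    opt.zero_grad()
    loss = ((layer(x) - y) ** 2).mean()
    loss.backward()
    opt.step()

print(layer.alpha.item())  # close to 3.0 after training
```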
