Traditionally, `y = XW + b`. What I want to do is `y = 0.5*XW + 0.2*b`.
What I did was define `a = nn.Parameter(torch.tensor(0.5).cuda())` and then compute `x = a * self.fc(x)`. But this scales the weight and bias simultaneously, since it multiplies the whole output `XW + b` by `a`.
Is there a way to attach a separate parameter to the weight term and to the bias term, so that each scale is learned through backpropagation?
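One possible approach (a sketch, not an official PyTorch recipe) is to skip `self.fc(x)` and instead call `torch.nn.functional.linear` yourself, passing the scaled weight and scaled bias separately. The module name `ScaledLinear` and the parameter names `a` and `b` below are made up for illustration; the `.cuda()` call from the original is omitted so the sketch runs on CPU, but you can move the module to GPU as usual with `.to(device)`:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class ScaledLinear(nn.Module):
    """Linear layer with separate learnable scales for weight and bias:
    y = a * (x @ W.T) + b * bias   (hypothetical example module)."""

    def __init__(self, in_features, out_features):
        super().__init__()
        self.fc = nn.Linear(in_features, out_features)
        # Two independent parameters, both updated by backpropagation.
        self.a = nn.Parameter(torch.tensor(0.5))  # scales the weight term
        self.b = nn.Parameter(torch.tensor(0.2))  # scales the bias term

    def forward(self, x):
        # F.linear computes x @ weight.T + bias, so scaling its two
        # arguments separately scales XW and b independently.
        return F.linear(x, self.a * self.fc.weight, self.b * self.fc.bias)
```

Because `a` and `b` are `nn.Parameter`s, gradients flow to them through `F.linear`, so an optimizer will update both scales along with the underlying weight and bias.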