Hi, I would like to use tanh(weight) in place of the raw weight, for example:
I am not sure if there is a problem with my code, so thank you so much for your help.
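Roughly something like this (a minimal sketch of the idea, not my full model; the module and parameter names are made up):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class TanhConv2d(nn.Module):
    """Convolution whose effective weights are tanh(weight), kept in (-1, 1)."""

    def __init__(self, in_ch, out_ch, kernel_size):
        super().__init__()
        self.weight = nn.Parameter(
            torch.randn(out_ch, in_ch, kernel_size, kernel_size) * 0.1
        )
        self.bias = nn.Parameter(torch.zeros(out_ch))

    def forward(self, x):
        # tanh is applied on the fly, so it stays in the autograd graph
        # and gradients flow through it back to self.weight.
        return F.conv2d(x, torch.tanh(self.weight), self.bias)
```

So the stored parameter is unconstrained, but the weights actually used by the convolution are always in (-1, 1).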
Thank you for your kind reply. Yes, weight is a learnable parameter, and I use tanh(weight) to limit the weight range.
Apart from the with torch.no_grad() context, what is the difference between putting self.conv1[0].weight[:] = torch.tanh(self.conv1[0].weight[:]) before def forward(self, x): and putting it inside def forward(self, x):?
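To make the two placements concrete, here is a sketch of what I mean (using a plain nn.Conv2d instead of my actual model):

```python
import torch
import torch.nn as nn

conv = nn.Conv2d(3, 8, 3)

# (a) Before forward, e.g. at construction time: the stored values
#     are squashed once, and training can later move them anywhere.
with torch.no_grad():
    conv.weight[:] = torch.tanh(conv.weight)

# (b) Inside forward: the weights get re-squashed on every call.
#     Because the in-place write happens under no_grad, autograd
#     never sees the tanh, so its gradient is not backpropagated.
def forward_with_squash(x):
    with torch.no_grad():
        conv.weight[:] = torch.tanh(conv.weight)
    return conv(x)
```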
If all you want is to limit the values to a range, why not use torch.clamp?
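For example, clamp hard-limits the values without distorting the ones already inside the range, whereas tanh squashes everything nonlinearly:

```python
import torch

w = torch.tensor([-2.0, -0.5, 0.0, 0.5, 2.0])

# Hard limit to [-1, 1]; in-range values are left untouched.
clamped = torch.clamp(w, min=-1.0, max=1.0)  # values: [-1.0, -0.5, 0.0, 0.5, 1.0]

# Nonlinear squash; every value is changed, not just the outliers.
squashed = torch.tanh(w)
```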
I just want to make sure you use the right tool here: tanh is an activation function, and it transforms your input into that range; depending on the task, it might not give you what you desire.
So if you know what you are doing, then fine; in case you didn't know, check out torch.clamp, maybe that is what you were looking for?
Hi, thank you for your suggestion. I have tried torch.clamp. Besides that, I also want to use tanh(weight) as a weight regularizer in the loss function, for example:
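Something along these lines (a rough sketch; the model, loss, and regularization strength are placeholders, not my real setup):

```python
import torch
import torch.nn as nn

model = nn.Conv2d(3, 8, 3)
criterion = nn.MSELoss()
reg_lambda = 1e-4  # hypothetical regularization strength

x = torch.randn(2, 3, 16, 16)
target = torch.randn(2, 8, 14, 14)

out = model(x)
# Penalize tanh(weight) on top of the task loss; since tanh is part
# of the graph here, its gradient reaches the weights in backward().
reg = torch.tanh(model.weight).pow(2).sum()
loss = criterion(out, target) + reg_lambda * reg
loss.backward()
```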