Is it possible to restrict the range of values a Variable can take? I have a variable that I want to keep in the range [0, 1], but the optimizer will push it outside this range. I am using torch.clamp() to ultimately clamp the result to [0, 1], but I want the optimizer itself to never update the value to be < 0 or > 1. For example, if my variable currently sits at 0.1 and the gradients come in and the optimizer wants to move it down by 0.5, which would make the new value -0.4, I want the optimizer to clamp that update to 0.1 so the variable only moves as far as my bounds allow.
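Here is a minimal sketch of the kind of setup I mean (simplified: a plain tensor with requires_grad instead of my real model, and a made-up target and learning rate):

```python
import torch

# Single parameter that should stay inside [0, 1].
w = torch.tensor(0.1, requires_grad=True)
optimizer = torch.optim.Adam([w], lr=0.5)

target = torch.tensor(1.0)

for step in range(10):
    optimizer.zero_grad()
    # I clamp the value where I use it in the forward pass...
    out = torch.clamp(w, 0.0, 1.0)
    loss = (out - target) ** 2
    loss.backward()
    optimizer.step()
    # ...but the underlying parameter itself can still leave [0, 1].
    print(step, w.item())
```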
I know I can register a hook on the variable, which I tried, but that only lets me control the size of the gradient, not the size of the actual update. I'm sure I could make this work by writing a custom optimizer, but there's no way I can beat the Adam optimizer.
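This is roughly the hook I tried (the clamp bounds here are just placeholders I picked):

```python
import torch

w = torch.tensor(0.1, requires_grad=True)

# Bounds the gradient itself, but Adam's actual step also depends on
# its running moment estimates, so bounding the gradient does not
# bound the update applied to w.
w.register_hook(lambda grad: grad.clamp(-0.1, 0.1))
```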