Restrict range of variable during gradient descent

I would copy the code for the Adam optimizer and modify its `step()` so that, after each update, the variable is clamped back into the allowed range.
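As a rough sketch of that idea (not the exact code, and assuming PyTorch): instead of copying the whole optimizer source, you can subclass `torch.optim.Adam` and add a clamping pass after the normal update. The bounds `lo`/`hi` and the choice to clamp every parameter are placeholders for illustration.

```python
import torch

class ClampedAdam(torch.optim.Adam):
    """Adam update followed by projecting each parameter into [lo, hi]."""

    def __init__(self, params, lo=0.0, hi=1.0, **kwargs):
        super().__init__(params, **kwargs)
        self.lo = lo
        self.hi = hi

    def step(self, closure=None):
        loss = super().step(closure)          # ordinary Adam update
        with torch.no_grad():                 # clamp outside autograd
            for group in self.param_groups:
                for p in group["params"]:
                    p.clamp_(self.lo, self.hi)
        return loss

# usage: opt = ClampedAdam(model.parameters(), lo=-0.5, hi=0.5, lr=1e-3)
```

If you only need to restrict a single variable rather than all parameters, you could instead call `variable.data.clamp_(lo, hi)` after `optimizer.step()` in your training loop.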
