Constraint parameter as in tensorflow/python/keras/constraints

Hello, is there any feature in PyTorch that resembles tensorflow/python/keras/constraints? For example, in Keras it is possible to constrain the parameter values using:


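A minimal sketch of the Keras side, using the built-in `tf.keras.constraints.NonNeg` constraint (an illustrative choice; any constraint from that module would work the same way):

```python
import tensorflow as tf

# A Dense layer whose kernel is constrained to be non-negative;
# Keras applies the constraint after each gradient update.
layer = tf.keras.layers.Dense(
    units=1,
    kernel_constraint=tf.keras.constraints.NonNeg(),
)

# The constraint object itself maps negative entries to zero:
w = tf.constant([-1.0, 0.5])
w_constrained = tf.keras.constraints.NonNeg()(w)
```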
to avoid negative values. I know that in PyTorch you can apply a function after the optimization step, as suggested in:

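i.e., something along these lines (a minimal sketch of the clamp-after-step approach, with a toy model of my own choosing):

```python
import torch

# Toy model with a parameter we want to keep non-negative.
model = torch.nn.Linear(3, 1)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

x = torch.randn(8, 3)
y = torch.randn(8, 1)

loss = torch.nn.functional.mse_loss(model(x), y)
loss.backward()
optimizer.step()

# Enforce the constraint manually by clamping in place after the update.
with torch.no_grad():
    model.weight.clamp_(min=0.0)
```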
But this doesn’t seem like the cleanest way to do it; it would be much better if you could register the constraint inside your model class. I know that the torch.distributions.constraints module exists, but I’m not sure it applies to my case. Any hint on this?
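To sketch the kind of in-model registration I have in mind, here is one possibility using `torch.nn.utils.parametrize` (available in recent PyTorch versions), where the parameter is reparametrized through a non-negativity mapping:

```python
import torch
import torch.nn.utils.parametrize as parametrize

class NonNegative(torch.nn.Module):
    """Reparametrize a tensor so the exposed value is always non-negative."""
    def forward(self, x):
        # relu is one choice; x.clamp(min=0) or softplus would also work
        return torch.relu(x)

linear = torch.nn.Linear(3, 2)
parametrize.register_parametrization(linear, "weight", NonNegative())

# linear.weight is now computed from the underlying parameter on access,
# so it satisfies the constraint at every optimization step.
```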
