Hi

Is it possible to constrain the values of the parameters of a layer to stay in a given range, for example to stay above 0.1 or below -0.1?

Thanks for your answers.

Registering a parametrization function on that weight may solve your problem.
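For the original question (keeping every weight above 0.1 or below -0.1), a minimal sketch using `torch.nn.utils.parametrize` could look like this. The `KeepAwayFromZero` module and the `eps` value are my own illustration, not part of any library; it assumes PyTorch 1.9+ where `register_parametrization` is available:

```python
import torch
import torch.nn as nn
import torch.nn.utils.parametrize as parametrize

class KeepAwayFromZero(nn.Module):
    """Hypothetical parametrization: pushes each weight out of
    the open interval (-eps, eps), so |w| >= eps always holds."""
    def __init__(self, eps: float = 0.1):
        super().__init__()
        self.eps = eps

    def forward(self, X: torch.Tensor) -> torch.Tensor:
        # Non-negative entries are clamped up to >= eps,
        # negative entries are clamped down to <= -eps.
        return torch.where(X >= 0,
                           X.clamp(min=self.eps),
                           X.clamp(max=-self.eps))

layer = nn.Linear(4, 4)
parametrize.register_parametrization(layer, "weight", KeepAwayFromZero(0.1))

# Accessing layer.weight now goes through the parametrization,
# so every entry satisfies the constraint.
print(layer.weight.abs().min().item())
```

The underlying unconstrained parameter lives in `layer.parametrizations.weight.original` and is what the optimizer updates; the constraint is re-applied on every access.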

Thanks

Actually, what I’m trying to do is the following. I have a linear layer in my model and I want to add somewhere else in the model a new linear layer that would act as the inverse of the previous one.

Say the first linear layer computes: y = Ax + b

Then, the other layer would compute the inverse mapping: x = (y - b)/A (elementwise if A is diagonal, or more generally x = A⁻¹(y - b)), where A and b are the weight and bias tensors of the first linear layer.

From what I understand of parametrization, this is exactly what I need to create the second layer, as a parametrization of the first one.

Do you think this is correct?
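Rather than a parametrization, one way to sketch this is a small module that simply holds a reference to the first `nn.Linear` and solves the linear system in its forward pass, so the two layers always share the same A and b. The `TiedInverseLinear` name is my own; this assumes a square, invertible weight matrix:

```python
import torch
import torch.nn as nn

class TiedInverseLinear(nn.Module):
    """Hypothetical sketch: computes the inverse of a square nn.Linear,
    x = A^{-1}(y - b), sharing the original layer's parameters (no copies)."""
    def __init__(self, linear: nn.Linear):
        super().__init__()
        assert linear.in_features == linear.out_features, \
            "only square layers are invertible"
        self.linear = linear  # reference, so weights stay tied

    def forward(self, y: torch.Tensor) -> torch.Tensor:
        A = self.linear.weight           # shape (n, n)
        b = self.linear.bias             # shape (n,)
        # nn.Linear computes y = x @ A.T + b, i.e. A x = y - b per sample;
        # solve that system instead of forming A^{-1} explicitly.
        return torch.linalg.solve(A, (y - b).unsqueeze(-1)).squeeze(-1)

fwd = nn.Linear(3, 3)
inv = TiedInverseLinear(fwd)

x = torch.randn(2, 3)
x_rec = inv(fwd(x))  # should recover x up to numerical error
```

Because `inv` only stores a reference to `fwd`, any gradient update to `fwd.weight` or `fwd.bias` is immediately reflected in the inverse, which seems closer to what you describe than registering a parametrization on a second, separate layer.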