# Constrain outputs in a regression problem

Hi, everyone.

I am attempting to constrain some outputs of my regression network, say x, y, z = model(data), where x, y, z are scalars. The constraint I want to impose is that when predicting all three dependent variables, the condition “x + y <= 1.0” must be honored. Given this description, can I implement this in the forward function?

Thank you!

You could use the torch.clamp() function and set max to 0.5 for each output (and min to whatever you want). Then x + y could never be greater than 1.
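A minimal sketch of that idea (the tensor values here are just made-up examples):

```python
import torch

x = torch.tensor([0.9, 0.3])
y = torch.tensor([0.8, 0.1])

# Clamp each output to at most 0.5, so their sum can never exceed 1.0.
x_c = torch.clamp(x, max=0.5)
y_c = torch.clamp(y, max=0.5)
```

Note that this caps each variable individually, which is stricter than the original constraint on the sum.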

Thanks for replying. What if either x or y can be larger than 0.5?

Ok, I found a way. You could do something like this:

```python
ratio = 1 / (x + y)
x *= ratio
y *= ratio
```

Then x + y will always equal 1. The only problem is that if x + y is already less than 1, this will still force it up to 1. To solve that, you can guard the rescaling with a condition:

```python
if x + y > 1:
    ratio = 1 / (x + y)
    x *= ratio
    y *= ratio
```
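If x and y are batched tensors rather than Python scalars, a plain `if` won't work elementwise. A vectorized sketch using `torch.where` (the function name `rescale` is just for illustration):

```python
import torch

def rescale(x, y, limit=1.0):
    """Scale x and y down proportionally wherever x + y exceeds `limit`."""
    s = x + y
    ratio = torch.where(s > limit, limit / s, torch.ones_like(s))
    return x * ratio, y * ratio

x = torch.tensor([0.8, 0.2])
y = torch.tensor([0.7, 0.3])
x2, y2 = rescale(x, y)
# The first pair is scaled so x2 + y2 == 1.0; the second pair is left unchanged.
```

This version can be dropped into a forward function, and it is differentiable wherever the constraint is active.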

Thanks, Dwight! I will give your suggestion a try!

Ok let me know if there are any more issues.

Do you think `softmax` might help here?
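For context, softmax outputs are nonnegative and sum to 1 along the chosen dimension, so with a third "slack" output, x + y <= 1 would hold by construction. A sketch of that idea:

```python
import torch
import torch.nn.functional as F

logits = torch.randn(4, 3)       # batch of 3 raw network outputs
p = F.softmax(logits, dim=1)     # each row sums to 1, entries in (0, 1)
x, y, slack = p[:, 0], p[:, 1], p[:, 2]
# x + y = 1 - slack, so x + y <= 1 by construction
```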

Hi. Yes, indeed, it did help. What I did was this: I had one Linear layer with a single output for x, which is connected to a loss function loss(x_hat, x), and to another loss that pushes the sum toward 1, i.e. loss(x_hat + y_hat, ones(size(x_hat + y_hat))). So, in total I had three loss functions. Finally, I had to weigh each respective loss function.
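For readers following along, here is a hypothetical sketch of that weighted multi-loss setup, assuming MSE losses and hand-tuned weights (all names and values below are illustrative, not the original code):

```python
import torch
import torch.nn as nn

mse = nn.MSELoss()

# Stand-ins for the network's two heads and the targets.
x_hat = torch.rand(8, 1, requires_grad=True)
y_hat = torch.rand(8, 1, requires_grad=True)
x_true, y_true = torch.rand(8, 1), torch.rand(8, 1)

# Fit losses for each output, plus a penalty pushing x_hat + y_hat toward 1.
w1, w2, w3 = 1.0, 1.0, 0.5  # hand-tuned loss weights
loss = (w1 * mse(x_hat, x_true)
        + w2 * mse(y_hat, y_true)
        + w3 * mse(x_hat + y_hat, torch.ones_like(x_hat)))
loss.backward()
```

Note that this enforces the constraint only softly, through the penalty term, rather than guaranteeing it in the forward pass.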