Can we create a neural network (a simple one, such as a multilayer perceptron) that contains only positive weights?

I was wondering if there is a specific method for training a well-performing neural network with only positive weights. I already tried clipping the weights before training and initializing the weights with only positive values, but that still doesn't give good results. Is there any other method to do so?

Thank you

Did you try the solution proposed here?

doesn't give good results

What is your baseline for deciding whether the result is good? Are you trying to match the accuracy of the same model with regular signed weights?

Thank you for your reply. By a good result I mean somewhat lower but still acceptable accuracy, since we are not using negative weights, which has an impact on the backprop algorithm.

Do you mean that p.data holds the weights of the network, and the clamp function will clip them to 0 if they are negative?

By a good result I mean somewhat lower but still acceptable accuracy, since we are not using negative weights, which has an impact on the backprop algorithm.

Sure that makes sense.

Do you mean that p.data holds the weights of the network, and the clamp function will clip them to 0 if they are negative?

That’s right.
Is this what you already tried here:

I already tried clipping the weights before training

?
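For reference, the clamping approach discussed here can be sketched as follows. This is a minimal sketch, not the exact code from the thread: the toy MLP architecture, the dummy data, and the hyperparameters are all placeholders.

```python
import torch
import torch.nn as nn

# Toy MLP; the architecture and sizes are illustrative placeholders
model = nn.Sequential(nn.Linear(4, 8), nn.ReLU(), nn.Linear(8, 2))
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
loss_fn = nn.CrossEntropyLoss()

x = torch.randn(16, 4)           # dummy inputs
y = torch.randint(0, 2, (16,))   # dummy labels

for _ in range(5):
    optimizer.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()
    optimizer.step()
    # After each update, clip any negative parameters back to 0
    with torch.no_grad():
        for p in model.parameters():
            p.clamp_(min=0.0)

# Every parameter is now non-negative
assert all((p >= 0).all() for p in model.parameters())
```

Clamping after each `optimizer.step()` is what keeps the constraint satisfied throughout training, since the gradient update itself is free to push weights negative.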

Thank you, I tried that and the results are acceptable. But my real question was: can I have something like model.weight.data.uniform_(0.0, 1.0) to initialize the weights in a positive range, and keep clamping them along the way to make sure they stay between 0 and 1? I tried that as well, and the results were horrible (around 20% accuracy). Is there any solution for this?
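The initialization-plus-clamping scheme described above could look like this (a sketch; the `nn.Linear` layer is a placeholder, and `uniform_` is PyTorch's in-place uniform initializer):

```python
import torch
import torch.nn as nn

layer = nn.Linear(8, 4)  # placeholder layer

# Initialize the weights in the positive range [0, 1]
layer.weight.data.uniform_(0.0, 1.0)

# ...training happens here; then, after each optimizer step,
# clamp the weights back into [0, 1]
with torch.no_grad():
    layer.weight.clamp_(0.0, 1.0)

assert (layer.weight >= 0).all() and (layer.weight <= 1).all()
```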

Or is there a way to not be affected too much by the lack of negative weights?

The only way would be to have an unsigned float type for the parameters, which does not exist. Otherwise, you would need to clamp something (parameters or gradients) at some point anyway.
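To illustrate the gradient-clamping alternative mentioned above: with plain SGD (no momentum or weight decay), the update is `p -= lr * grad`, so clamping each gradient to be non-positive guarantees that non-negative weights stay non-negative. This is a sketch under those assumptions, with a placeholder model and dummy data:

```python
import torch
import torch.nn as nn

model = nn.Linear(4, 2)  # placeholder model
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

# Start from non-negative weights so gradient clamping can keep them there
with torch.no_grad():
    for p in model.parameters():
        p.clamp_(min=0.0)

optimizer.zero_grad()
loss = model(torch.randn(8, 4)).sum()  # dummy loss
loss.backward()

# Clamp the gradients instead of the parameters: since SGD computes
# p -= lr * grad, grads <= 0 can only increase (or keep) each weight
for p in model.parameters():
    p.grad.clamp_(max=0.0)
optimizer.step()

assert all((p >= 0).all() for p in model.parameters())
```

Note that this reasoning breaks down with optimizers that keep momentum or adaptive state, which is why clamping the parameters themselves is the more robust option.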

By the way, I am curious: why do you need to have only positive weights?