I was wondering if there is a specific method to create a well-performing neural network with only positive weights. I have already tried things like clipping the weights before training and initializing the weights with only positive values, but it still doesn't give good results. Is there any other method to do so?
Thank you for your reply. By a good result I mean results that are lower but still acceptable, since not using negative weights has an impact on the backpropagation algorithm.
Thank you, I tried that and the results are acceptable, but my real question was: can I have something like model.weight.data.uniform_(0.0, 1.0) to initialize the weights in a positive range and keep clamping them along the way to make sure they stay between 0 and 1? (I tried that as well and the results seem horrible, around 20% accuracy.) Is there any solution for this?
Or, alternatively, how can I keep the model from being affected too much by the lack of negative weights?
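Roughly, the setup I am describing looks like this (a minimal sketch, not my exact code; the model, optimizer, and training step are just placeholders):

```python
import torch
import torch.nn as nn

# Placeholder classifier; the real model is different, this is just for illustration.
model = nn.Sequential(nn.Linear(784, 256), nn.ReLU(), nn.Linear(256, 10))

# Initialize every parameter uniformly in the positive range [0, 1].
with torch.no_grad():
    for param in model.parameters():
        param.uniform_(0.0, 1.0)

optimizer = torch.optim.SGD(model.parameters(), lr=0.01)

def training_step(inputs, targets):
    optimizer.zero_grad()
    loss = nn.functional.cross_entropy(model(inputs), targets)
    loss.backward()
    optimizer.step()
    # Clamp the parameters back into [0, 1] after every update so they
    # never leave the positive range.
    with torch.no_grad():
        for param in model.parameters():
            param.clamp_(0.0, 1.0)
    return loss.item()
```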
The only way would be to have an unsigned float type for the parameters, which does not exist. Otherwise, you need to clamp something (the params or the grads) at some point anyway.
Btw, I am curious: why do you need to have only positive weights?
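For example (just a sketch; the NonNegative module and the layer below are only illustrative), you could express that clamping through torch.nn.utils.parametrize, so the weight seen by the forward pass is always non-negative while the optimizer updates an unconstrained underlying tensor:

```python
import torch
import torch.nn as nn
from torch.nn.utils import parametrize

class NonNegative(nn.Module):
    # Clamp the underlying tensor to >= 0 every time the weight is used.
    def forward(self, X):
        return X.clamp(min=0.0)

layer = nn.Linear(16, 8)  # any layer with a "weight" tensor works the same way
parametrize.register_parametrization(layer, "weight", NonNegative())

# The weight accessed in the forward pass is now always non-negative.
print(torch.all(layer.weight >= 0))  # tensor(True)
```

Note that a hard clamp has zero gradient wherever the underlying value is negative, so those entries can get stuck at zero; something smoother such as torch.nn.functional.softplus is sometimes used in the parametrization instead.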