How to use sigmoid on only positive values?

I have a binary classification model whose final linear layer outputs only positive values (don’t ask why, that’s a different matter). When I pass that layer’s output to torch.sigmoid, all the results are above 50%, precisely because the inputs are all positive. How can I fix this and output a proper probability? Is there a “positive only” sigmoid in PyTorch?

Hi Richard!

The fact that your final layer only produces positive values is the
core issue you need to address – tweaking sigmoid() to undo whatever
damage you’ve done would be a sideshow.

As for “don’t ask why, that’s a different matter” – well, it actually
does matter, because whatever you’ve done is breaking the
interpretation of your model’s output as reasonable logits (that
become reasonable probabilities when passed through sigmoid()).

My advice: Tell us what you are actually doing and what your
motivation is. My guess is that you’ll be able to use standard
techniques to build your classifier. But if your problem has some
unusual properties that prevent you from using standard techniques,
you’ll likely get better advice from the forum if you describe your
problem and what makes it atypical.

(As an aside, and against my better judgment, I will comment on some
mathematical structure. But please don’t actually try doing this. You
want to convert “positive-only” things that look sort of like logits into
“normal” logits that range from -inf to inf, and that therefore become
“normal” probabilities that range from 0 to 1 when passed through
sigmoid(). The log() function maps the positive half of the real line
onto the whole real line, so you could use the following conversion:
normal_logit = log(positive_only_logit).)
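
As a minimal sketch of that conversion (the tensor values here are just
illustrative, not from any real model):

```python
import torch

# Hypothetical positive-only outputs from the final linear layer
positive_only_logits = torch.tensor([0.25, 1.0, 4.0])

# log() maps the positive half-line (0, inf) onto the whole real
# line (-inf, inf), turning these into "normal" logits
normal_logits = torch.log(positive_only_logits)

# sigmoid() of a "normal" logit ranges over the full (0, 1)
probs = torch.sigmoid(normal_logits)
```

Note that sigmoid(log(x)) simplifies algebraically to x / (1 + x), so
a positive-only value of 1.0 maps to a probability of exactly 0.5 –
values below 1.0 land under 50% and values above 1.0 land over 50%.
But again, this is a mathematical curiosity, not a recommendation.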

Good luck.

K. Frank