torch.nn.Sigmoid vs torch.sigmoid

What is the difference between these two sigmoid functions?


AFAIK, torch.nn.Sigmoid calls torch.nn.functional.sigmoid in the background, and according to this answer, the functional and torch.xxx calls differ in their backward implementations (the torch.nn version being more efficient and GPU-capable).


Hi Rohan!

torch.nn.Sigmoid (note the capital “S”) is a class. When you
instantiate it, you get a function object, that is, an object that you
can call like a function. In contrast, torch.sigmoid is a function.
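For concreteness, here is a minimal sketch of the two call styles (the tensor x is just an illustrative input):

```python
import torch

x = torch.randn(3)

# torch.nn.Sigmoid is a class: construct the function object first, then call it.
sig = torch.nn.Sigmoid()
y_from_module = sig(x)

# torch.sigmoid is a plain function: call it directly on the tensor.
y_from_function = torch.sigmoid(x)
```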

From the source code for torch.nn.Sigmoid, you can
see that it calls torch.sigmoid, so the two are functionally
the same.
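You can also check this numerically (a quick sanity check, not a proof that they share an implementation):

```python
import torch

x = torch.randn(5)
print(torch.allclose(torch.nn.Sigmoid()(x), torch.sigmoid(x)))  # True
```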

(Why even have such a class / function object? Because,
although it isn’t the case for Sigmoid, in many cases when
you construct a PyTorch function object you can pass
parameters to the constructor that control the behavior of the
function. This is useful when the caller isn’t able (or it would
just be annoying) to pass in those parameters when actually
calling the function.)
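LeakyReLU is one such case (my example, not something specific to Sigmoid): the negative slope is fixed at construction time, so whatever calls the resulting object later never has to pass that parameter.

```python
import torch

# The slope for negative inputs is baked in when the function object is built ...
leaky = torch.nn.LeakyReLU(negative_slope=0.2)

# ... so a container that calls it later doesn't need to know about that parameter.
model = torch.nn.Sequential(torch.nn.Linear(4, 4), leaky)
y = model(torch.randn(4))
```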

As for Alex’s comment, he references
torch.nn.functional.sigmoid, which is (probably) different
from torch.sigmoid. (Again, in any event, directly from the
source code, torch.nn.Sigmoid calls torch.sigmoid, so
the two are functionally the same.)

As for the post Alex links to, about
torch.nn.functional.sigmoid having a different backward
implementation than torch.sigmoid, I suspect that this is out
of date (or perhaps just incorrect). Its documentation shows
that it’s deprecated in favor of torch.sigmoid() and that it
calls input.sigmoid(). I very much doubt that
torch.nn.functional.sigmoid and torch.sigmoid
do anything different (but since I can’t find any code for
torch.sigmoid or tensor.sigmoid it’s hard to tie this
up into a neat little package).
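What I can do is check numerically that the forward and backward values agree (a quick sketch, not a substitute for reading the code I couldn’t find):

```python
import torch
import torch.nn.functional as F

x = torch.randn(5, requires_grad=True)

# Forward values agree.
y1 = torch.sigmoid(x)
y2 = F.sigmoid(x)  # may emit a deprecation warning, depending on the version
print(torch.allclose(y1, y2))  # True

# Backward (gradient) values agree as well.
g1, = torch.autograd.grad(y1.sum(), x)
g2, = torch.autograd.grad(y2.sum(), x)
print(torch.allclose(g1, g2))  # True
```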

Good luck.

K. Frank
