Is there any difference between torch.sigmoid and torch.nn.functional.sigmoid?

I am confused about torch.xxxx, which supports both Tensors and Variables: is it a function that accepts Variables, or just a normal function that handles input tensors? And what is the difference between torch.xxxx and torch.nn.functional.xxx when the input is a Variable?


torch.* functions work for both Tensors and Variables. There’s no difference between torch.sigmoid and torch.nn.functional.sigmoid.
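A quick way to see the equivalence is to call both entry points on the same data. Here is a minimal sketch written against the old Tensor/Variable API this thread is about (since PyTorch 0.4 a Variable is just a Tensor, so the same check works on newer versions too):

import torch
import torch.nn.functional as F
from torch.autograd import Variable  # pre-0.4 autograd wrapper; on newer versions it returns a plain Tensor

x = torch.randn(4)
v = Variable(torch.randn(4), requires_grad=True)

# Both entry points accept a plain Tensor and produce identical values...
print(torch.equal(torch.sigmoid(x), F.sigmoid(x)))            # True

# ...and likewise for a Variable.
print(torch.equal(torch.sigmoid(v).data, F.sigmoid(v).data))  # True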


Actually, there is a difference.
The implementations in torch.xxx have their backward implemented using Python calls, while the functional counterparts have their backward implemented entirely in C/CUDA, so the functional backward code is more efficient.
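If you want to check whether this gap matters for your workload, a rough micro-benchmark like the sketch below is one option. The result depends heavily on the PyTorch version; in recent releases both entry points should dispatch to the same underlying kernel, so little or no difference may remain.

import time
import torch
import torch.nn.functional as F

def time_backward(fn, iters=200):
    # Time forward + backward through `fn`; purely illustrative, not a rigorous benchmark.
    x = torch.randn(1000, 1000, requires_grad=True)
    start = time.perf_counter()
    for _ in range(iters):
        fn(x).sum().backward()
        x.grad = None  # drop accumulated gradients between iterations
    return time.perf_counter() - start

print("torch.sigmoid         :", time_backward(torch.sigmoid))
print("nn.functional.sigmoid :", time_backward(F.sigmoid))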


Thanks, I see.

Another difference is the help message: there is help text for the Python objects, but no text for the functions registered in C.
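You can see that difference directly with Python's built-in help(). How much text torch.sigmoid shows depends on the version, since newer releases attach docstrings to the C-registered functions as well:

import torch
import torch.nn.functional as F

# The Python-level wrapper in nn.functional carries a regular docstring.
help(F.sigmoid)

# The C-registered torch.sigmoid showed no help text in older releases;
# newer versions attach a docstring to it too.
help(torch.sigmoid)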


Is this still the case? If so, should I be writing all my code using the functional API?


I guess this is deprecated (at least as of PyTorch 1.0.0). But since this thread comes up so easily via Google, I wrote this reply :slight_smile:

UserWarning: nn.functional.sigmoid is deprecated. Use torch.sigmoid instead.
  warnings.warn("nn.functional.sigmoid is deprecated. Use torch.sigmoid instead.")
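To confirm which entry point triggers this on your install, a sketch like the following (using the standard warnings module) should show that only the functional version is flagged:

import warnings
import torch
import torch.nn.functional as F

x = torch.randn(3)

with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    F.sigmoid(x)      # deprecated alias, expected to emit the UserWarning above
    torch.sigmoid(x)  # recommended spelling, no warning expected

for w in caught:
    print(w.category.__name__, ":", w.message)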

Hi @fmassa: is this still the case?