I did not expect the output below, as I thought sigmoid mapped the values of a tensor to values strictly between 0 and 1?
Perhaps it's related to the finite precision of floats, but I didn't manage to track down the relevant docs.
> sig tensor([1.0000, 1.0000, 1.0000])
Yep, the values are just rounded to 1 when printed; the actual results are still in the valid range, i.e. strictly between 0 and 1. The rounding only happens during printing.
import torch

val = torch.tensor([10.0, 11.0, 12.0]).sigmoid()
for i in range(val.shape[0]):
    print(f"{val[i].item():.10f}")  # each value is strictly < 1
torch.set_printoptions(precision=5) might also help to view the output with more digits.
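To see that this is purely a display issue and not specific to PyTorch, here is a minimal pure-Python sketch (the `sigmoid` helper is defined here for illustration): the same value prints as 1.0000 at four decimal places but is still strictly below 1.

```python
import math

def sigmoid(x: float) -> float:
    # Logistic function: 1 / (1 + e^-x).
    return 1.0 / (1.0 + math.exp(-x))

for x in (10.0, 11.0, 12.0):
    y = sigmoid(x)
    # Four decimal places round up to 1.0000, yet y < 1.0 holds.
    print(f"{y:.4f}  vs  {y:.10f}  (y < 1.0: {y < 1.0})")
```

With enough input magnitude the float result can eventually saturate to exactly 1.0 due to finite precision, but for inputs like 10–12 the stored value is still below 1; only the default print precision hides that.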