KelleyYin (Kelley Yin):
I have a question about the softmin function behaving differently across PyTorch versions.
In version 0.4.0, if I do the following:
import torch
import torch.nn.functional as F
a = torch.randn(3, 4)
b = F.softmin(a, dim=-1).sum(dim=-1)
b is tensor([1., 1., 1.]) — each row of the softmin output sums to 1, as expected.
In version 0.4.1,
a = torch.randn(3, 4)
b = F.softmin(a, dim=-1).sum(dim=-1)
b is tensor([-1.0000, -1.0000, -1.0000]) — the rows sum to -1 instead of 1.
Is this a bug?
I don't have version 1.0 installed; if anyone knows whether it behaves correctly there, please let me know.
K_Frank (K. Frank):
Hello Kelley!
This appears to be a known bug that has since been fixed; see this GitHub issue:
(For what it's worth, when I run your softmin test on PyTorch version 1.0.1, I get the right answer, that is, the softmin values sum to 1.)
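If upgrading isn't an option, one workaround is to compute softmin by hand: softmin(x) is defined as softmax(-x), and softmax rows always sum to 1. A minimal sketch (the variable names here are just for illustration):

```python
import torch
import torch.nn.functional as F

torch.manual_seed(0)
a = torch.randn(3, 4)

# softmin(x) == softmax(-x); negating the input and calling softmax
# sidesteps the buggy F.softmin in 0.4.1.
manual = F.softmax(-a, dim=-1)

# Each row sums to 1, as a proper probability distribution should.
row_sums = manual.sum(dim=-1)
print(row_sums)

# On a fixed PyTorch version, the built-in agrees with the manual form.
builtin = F.softmin(a, dim=-1)
print(torch.allclose(manual, builtin))
```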
Best.
K. Frank