Hi,
There seems to be strange behavior in torch.logsumexp: I get different results depending on the dim argument and the magnitude of the values. Is this normal?
import torch

a = torch.tensor(-130.)
b = torch.tensor(-130.).view(1, 1)
torch.logsumexp(a, 0)       # returns -130
torch.logsumexp(b, 0)       # returns -130
torch.logsumexp(b, (0, 1))  # returns -inf
But if I use a value of smaller magnitude, I get:
a = torch.tensor(-80.)
b = torch.tensor(-80.).view(1, 1)
torch.logsumexp(a, 0)       # returns -80
torch.logsumexp(b, 0)       # returns -80
torch.logsumexp(b, (0, 1))  # returns -80
For an intermediate magnitude, I don't get -inf but a slightly inaccurate result:
a = torch.tensor(-100.).view(1, 1)
torch.logsumexp(a, (0, 1))  # returns -99.981
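In case it's relevant, I noticed that a plain exp already underflows in float32 at these magnitudes, which is my guess for what happens if no max-shift is applied before the sum:

```python
import torch

# exp(-130) ~ 3.9e-57 is below the smallest float32 subnormal (~1.4e-45),
# so it flushes to exactly 0, and log(0) would give -inf.
print(torch.exp(torch.tensor(-130.)))  # tensor(0.)

# exp(-100) ~ 3.7e-44 lands in the subnormal range (below ~1.18e-38),
# so it is representable but with reduced precision, which would match
# the slightly-off -99.981.
print(torch.exp(torch.tensor(-100.)))

# exp(-80) ~ 1.8e-35 is a normal float32 value, so -80 round-trips
# through exp/log with essentially no error.
print(torch.log(torch.exp(torch.tensor(-80.))))
```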
Is this normal?
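As a possible workaround (just a guess, based on the single-dim calls behaving well above), reducing one dimension at a time instead of passing a tuple of dims seems to avoid the problem:

```python
import torch

b = torch.tensor(-130.).view(1, 1)

# Single-dim logsumexp appears to apply the usual max-shift, so chaining
# two single-dim reductions avoids the -inf I get with dim=(0, 1).
out = torch.logsumexp(torch.logsumexp(b, dim=1), dim=0)
print(out)  # tensor(-130.)
```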
Thank you