Strange behavior of logsumexp

Hi,

There seems to be some strange behavior in torch.logsumexp: I get different results depending on the shape of the tensor and the value it holds. Is this normal?

import torch

a = torch.tensor(-130.)
b = torch.tensor(-130.).view(1, 1)

torch.logsumexp(a, 0)       # returns -130
torch.logsumexp(b, 0)       # returns -130
torch.logsumexp(b, (0, 1))  # returns -inf

But if I take a number of smaller magnitude, I get

a = torch.tensor(-80.)
b = torch.tensor(-80.).view(1, 1)

torch.logsumexp(a, 0)       # returns -80
torch.logsumexp(b, 0)       # returns -80
torch.logsumexp(b, (0, 1))  # returns -80

For an intermediate value, I don't get -inf, but a slightly different number:

a = torch.tensor(-100.).view(1, 1)

torch.logsumexp(a, (0, 1))  # returns -99.981

Is this normal?

Thank you

Hi Yazid!

I can’t reproduce this on pytorch version 1.7.1. (It does look like a
minor bug.) Could you tell us what version you are using?

>>> import torch
>>> torch.__version__
'1.7.1'
>>> torch.logsumexp (torch.tensor (-130.).view (1, 1), (0, 1))
tensor(-130.)
>>> torch.logsumexp (torch.tensor (-100.).view (1, 1), (0, 1))
tensor(-100.)
>>> torch.logsumexp (torch.tensor (-80.).view (1, 1), (0, 1))
tensor(-80.)
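
For what it's worth, the symptoms you see are consistent with the
multi-dimension reduction in 1.6.0 computing log (sum (exp (x)))
naively, without first subtracting the max. In float32, exp (-130.)
(about 3.5e-57) underflows to zero, so the naive formula gives
log (0.) = -inf; exp (-100.) (about 3.7e-44) survives only as a
subnormal with a few bits of precision, which matches your
slightly-off -99.981; and exp (-80.) (about 1.8e-35) is a normal
float32 value, so -80 comes back exactly. Here is a minimal sketch
of the usual "subtract the max" trick (just an illustration, not
pytorch's actual kernel):

>>> x = torch.tensor (-130.).view (1, 1)
>>> torch.log (torch.exp (x).sum())   # naive version underflows
tensor(-inf)
>>> m = x.max()
>>> m + torch.log (torch.exp (x - m).sum())   # stable version
tensor(-130.)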

Best.

K. Frank

Thanks, KFrank, for your quick reply.

I was on 1.6.0. I just updated to 1.7.0 and the bug disappeared.