Sum of Variable not equal to sum of its data

Hi,

I found that the sum of a Variable does not equal the sum of its data. Is this normal?

Example Code:



import torch
t = torch.rand(10000).gt(0.05)
t.sum()
from torch.autograd import Variable
vt = Variable(t)
vt.sum()
vt.data.sum()

n = 0
for item in t:
	if item > 0:
		n = n + 1

n

python shell output:


Python 2.7.12 (default, Nov 19 2016, 06:48:10)
[GCC 5.4.0 20160609] on linux2
Type "help", "copyright", "credits" or "license" for more information.
>>>
>>> import torch
>>> t = torch.rand(10000).gt(0.05)
>>> t.sum()
9506
>>> from torch.autograd import Variable
>>> vt = Variable(t)
>>> vt.sum()
Variable containing:
 34
[torch.ByteTensor of size 1]

>>> vt.data.sum()
9506
>>>
>>> n = 0
>>> for item in t:
...     if item > 0:
...             n = n + 1
...
>>> n
9506

We can see that t.sum() = 9506 but vt.sum() = 34. Is this a bug, or did I miss something?

Thanks

The reason is that Variable methods always return a size-1 tensor for scalar results. Since t is a ByteTensor, vt.sum() unfortunately also returns a ByteTensor, so the count is accumulated in an unsigned byte and wraps around modulo 256.
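Assuming a byte-wide (uint8) accumulator, the reported 34 is exactly the true count wrapped modulo 256:

```python
# An unsigned 8-bit accumulator wraps around at 256, so the true
# count 9506 is reported as its remainder modulo 256.
true_count = 9506
wrapped = true_count % 256
print(wrapped)  # 34
```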

This is a known issue and will be fixed when we properly introduce scalar types. The issue is tracked here: https://github.com/pytorch/pytorch/issues/1389

To work around this, you can cast the comparison result to long before summing.
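A minimal sketch of that workaround, using the same Variable API as the thread (the variable names are just illustrative):

```python
import torch
from torch.autograd import Variable

# Comparison ops like gt() return a byte/bool mask.
t = torch.rand(10000).gt(0.05)

# Cast to long *before* summing so the scalar result is accumulated
# in 64-bit integers rather than wrapping in a single byte.
vt = Variable(t.long())
n = vt.sum()  # now matches vt.data.sum()
```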
