```
>>> a = torch.randn(1,2)
>>> a
0.4613 -0.6274
[torch.FloatTensor of size 1x2]
>>> torch.sum(a)
-0.16612520813941956
```

According to the official docs, `torch.sum` should return a Tensor, so why did I get a float?

There are two versions of `torch.sum`: one that sums all elements in a Tensor and returns a float, and one that sums the elements along a given dimension and returns a Tensor.

```
>>> a = torch.randn(1,2)
>>> a
1.1424 0.5776
[torch.FloatTensor of size 1x2]
>>> torch.sum(a)
1.7199773788452148
>>> torch.sum(a,0)
1.1424
0.5776
[torch.FloatTensor of size 2]
>>> torch.sum(a,1)
1.7200
[torch.FloatTensor of size 1]
```

But the official docs say they will both return a Tensor:

http://pytorch.org/docs/master/torch.html#reduction-ops

Ah, interesting! It looks like a recent change. You linked to the docs for the master branch, but the same docs for v0.3.1 show the two different return types. I guess you’re using an older version of PyTorch, but looking at the latest docs.

```
>>> torch.__version__
'0.3.1.post2'
```

I'd prefer they both return a Tensor… Why change that?

Hi,

In master, `Tensor` and `Variable` are now the same thing, so the behaviour matches what `Variable` used to do: it returns a `Tensor` with a single element in it.

In 0.3.1, summing a `Tensor` returned a Python float, while summing a `Variable` returned another `Variable` with a single element. This inconsistency is now fixed in master.
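For anyone reading this later: in PyTorch 0.4 and newer (where `Tensor` and `Variable` are merged), a full reduction returns a 0-dimensional Tensor, and you call `.item()` to get a plain Python number. A minimal sketch of the new behaviour, assuming a recent PyTorch install:

```python
import torch

a = torch.randn(1, 2)

# Full reduction: returns a 0-dim Tensor, not a Python float.
total = torch.sum(a)
print(total.dim())                       # 0
print(isinstance(total, torch.Tensor))   # True

# To get a plain Python number, call .item() explicitly.
print(total.item())

# Reductions along a dimension still return a Tensor of the reduced shape.
print(torch.sum(a, dim=0).shape)         # torch.Size([2])
print(torch.sum(a, dim=1).shape)         # torch.Size([1])
```

So both versions of `torch.sum` now return a Tensor, which is what the master docs describe.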
