Weird behavior of resize() function

Hi, when I call resize on a Variable, the underlying data changes as follows:

criterion.x1
Variable containing:
-5
11
2
3
0
44
-1
32
15
0
[torch.FloatTensor of size 10]

criterion.x1.resize(10, 1)
Variable containing:
-5
4
0
-1
4
0
1
0
0
0
[torch.FloatTensor of size 10x1]

The same thing happens when I extract the data out of the Variable.
Did I miss something?

Don’t use resize; use view.
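
For example, with view the same data simply gets a new shape (a quick sketch; t here just stands in for criterion.x1.data):

>>> t = torch.FloatTensor([-5, 11, 2, 3, 0, 44, -1, 32, 15, 0])
>>> y = t.view(10, 1)   # same storage reinterpreted as 10x1; no values change
>>> y[0][0], y[5][0]
(-5.0, 44.0)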

Correct. But why does resize change the values in the tensor? It's really weird.

Probably because your original tensor was not contiguous. If resize_ changes the size of the tensor, it starts from the same point but uses a contiguous chunk of memory.

For example:

>>> m = torch.arange(0, 25).view(5, 5)
>>> m
  0   1   2   3   4
  5   6   7   8   9
 10  11  12  13  14
 15  16  17  18  19
 20  21  22  23  24
[torch.FloatTensor of size 5x5]
>>> x = m[:, 0]
>>> x
  0
  5
 10
 15
 20
[torch.FloatTensor of size 5]
>>> x.resize_(5, 1)
 0
 1
 2
 3
 4
[torch.FloatTensor of size 5x1]
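
You can check for this with is_contiguous(), and get a safe reshape by copying into contiguous memory first (a sketch; x is the column from above, taken again before any resize):

>>> x = m[:, 0]
>>> x.is_contiguous()   # the column is a view into m's storage with stride 5
False
>>> x.contiguous().view(5, 1)   # copy into contiguous memory, then reshape
  0
  5
 10
 15
 20
[torch.FloatTensor of size 5x1]

Note that view itself refuses to run on non-contiguous tensors, which is why it can never silently scramble your data the way resize_ did here.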