expand(): memory keeps increasing

When I run the following code, the memory usage keeps increasing iteration by iteration.

import torch
from torch.autograd import Variable

while True:
    a = Variable(torch.FloatTensor(32, 16).cuda())
    r = Variable(torch.FloatTensor(1).cuda())
    c = r.expand(a.size())  # passing the torch.Size returned by a.size()

If I instead call r.expand([32, 16]) with a plain list, memory usage does not increase.
Did I use expand() incorrectly? Does anyone else have the same issue?
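
To make the growth visible, one can print the allocator stats each iteration. A minimal check, assuming the PyTorch build provides torch.cuda.memory_allocated() (older builds may not have it):

import torch
from torch.autograd import Variable

while True:
    a = Variable(torch.FloatTensor(32, 16).cuda())
    r = Variable(torch.FloatTensor(1).cuda())
    c = r.expand(a.size())
    # memory_allocated() may not exist in older PyTorch builds
    print(torch.cuda.memory_allocated())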

Thank you!

This does not make memory usage increase:

>>> while True:
...     a = Variable(torch.FloatTensor(32, 16).cuda())
...     s1, s2 = int(a.size()[0]), int(a.size()[1])  # plain Python ints, not a torch.Size
...     r = Variable(torch.FloatTensor(1).cuda())
...     c = r.expand((s1, s2))
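
The same workaround in one line, for any number of dimensions (a sketch; the point is just converting the torch.Size entries to plain Python ints before calling expand()):

>>> while True:
...     a = Variable(torch.FloatTensor(32, 16).cuda())
...     r = Variable(torch.FloatTensor(1).cuda())
...     # tuple(int(s) ...) turns the torch.Size into a plain tuple of ints
...     c = r.expand(tuple(int(s) for s in a.size()))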

I tried it, and I hit the same bug when I directly pass a torch.Size (the result of a.size()) to expand().
I have no idea what causes this…
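
One way to narrow it down is to check whether the memory is truly leaked or just held by the caching allocator. A sketch, assuming your build has torch.cuda.empty_cache() and torch.cuda.memory_allocated() (older builds may lack both): if the printed number keeps growing even after collecting garbage and emptying the cache, something is really holding references.

>>> import gc
>>> while True:
...     a = Variable(torch.FloatTensor(32, 16).cuda())
...     r = Variable(torch.FloatTensor(1).cuda())
...     c = r.expand(a.size())
...     gc.collect()               # drop unreachable Python objects
...     torch.cuda.empty_cache()   # may not exist in older PyTorch builds
...     print(torch.cuda.memory_allocated())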