PyTorch is great, but I am struggling with a baffling memory leak. I am using PyTorch version 1.12, installed via Anaconda, on an Ubuntu machine.

This leaks memory:

```
import numpy as np
from torch import Tensor

Xin_memory = Tensor(2000100, 60, 50)  # large preallocated buffer
Xin2 = Tensor(1, 60, 50)              # small control buffer
for i in range(2000000):
    nda = np.empty((60, 50))
    Xin_memory[i + 1, :, :] = Tensor(nda)  # assigning into the big buffer leaks
    # Xin2[0, :, :] = Tensor(nda)
    print(i)
```
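
For what it is worth, this is roughly how I am measuring the growth (a minimal sketch; `report` is just a helper I cobbled together, and it assumes `psutil` is installed). Calling `report(i)` inside the loop shows the resident set size climbing steadily in the snippet above, while it stays flat in the snippet below.

```
import os
import psutil

proc = psutil.Process(os.getpid())

def report(i):
    # print the process's resident set size every 100000 iterations
    if i % 100000 == 0:
        rss_mib = proc.memory_info().rss // (1024 * 1024)
        print(i, rss_mib, "MiB RSS")
```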

This does not:

```
import numpy as np
from torch import Tensor

Xin_memory = Tensor(2000100, 60, 50)
Xin2 = Tensor(1, 60, 50)
for i in range(2000000):
    nda = np.empty((60, 50))
    # Xin_memory[i + 1, :, :] = Tensor(nda)
    Xin2[0, :, :] = Tensor(nda)  # assigning into the small buffer does not leak
    print(i)
```

Help! Do I misunderstand what the assignment is doing? I am under the impression that the assignment into Xin_memory copies the values, after which the temporary Tensor no longer exists; eventually nda goes out of scope as well and is garbage collected. I believe this pattern is slowly eating away at memory in some long-running code I am working with. Please let me know if I can provide any additional information.
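
To make my mental model concrete, this is the behavior I am assuming (a minimal sketch of what I expect, not something I have verified against the docs): slice assignment copies the element values into the destination's own storage, so the temporary source objects can be collected afterwards.

```
import numpy as np
from torch import Tensor

buf = Tensor(2, 3, 3)        # preallocated destination
src = np.ones((3, 3))        # temporary NumPy array
buf[0, :, :] = Tensor(src)   # I expect this to copy values into buf's storage
del src                      # the array (and the short-lived Tensor wrapping
                             # it) should now be eligible for garbage collection
print(buf[0, 0, 0])          # still 1.0, so the copy itself clearly happened
```

If that assumption is wrong, I would love to know what actually keeps a reference alive.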

Thank you.