I’m working on a GAN model whose generator produces a tensor of size `(3, 128, 128)` per image,

which I dump with (roughly) this code:

```
import torch

image = Generator(noise).clone()    # generator output: a batch of images
tensor = image[0].detach().cpu()    # first image of the batch
torch.save(tensor, save_path)
```

The problem is that this saved tensor takes up far more storage than a freshly created tensor of the same size, even though the two have the same `shape` and `dtype`:

```
>>> import sys
>>> import torch
>>> tensor = torch.load(save_path)
>>> rand_t = torch.randn(tensor.shape)
>>> print(tensor.shape, rand_t.shape)
torch.Size([3, 128, 128]) torch.Size([3, 128, 128])
>>> print(tensor.dtype, rand_t.dtype)
torch.float32 torch.float32
>>> print(sys.getsizeof(tensor.storage()))
9830472
>>> print(sys.getsizeof(rand_t.storage()))
196680
```

When I dump both tensors to disk, the one from the generator takes `9.2 MB` while the random tensor takes only `197.4 kB`. I’ve read through the PyTorch documentation but found nothing relevant. Can someone help me figure out what the difference between them is?
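Update: here is a minimal, self-contained comparison that seems to isolate the behaviour. The batch size of 50 is only my guess, reverse-engineered from the `9830472`-byte figure above; everything else is plain PyTorch.

```
import torch

# Hypothetical stand-in for the generator output: a batch of 50 images.
# (50 is an assumption -- 9830472 bytes is about 50 * 3 * 128 * 128 * 4.)
batch = torch.randn(50, 3, 128, 128)

# Indexing returns a *view*: it has the same shape and dtype as a
# standalone tensor, but still shares the storage of the whole batch.
view = batch[0]
standalone = torch.randn(3, 128, 128)

print(view.untyped_storage().nbytes())        # bytes of the whole batch
print(standalone.untyped_storage().nbytes())  # bytes of a single image

# Cloning the slice copies just its elements into fresh storage,
# so only the single image's bytes would end up in the saved file.
print(view.clone().untyped_storage().nbytes())
```

If this is indeed what is happening in my case, then calling `.clone()` on the slice right before `torch.save` should shrink the file to the expected ~197 kB.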