Why does my input tensor need to allocate so much memory?

I followed this previous thread to measure how much memory is being allocated for the input tensor to my CNN. After applying the formula tensor.element_size() * tensor.nelement(), I found that each example allocates 1.18 MB. The dimensions of this tensor are (batch_size, num_images, input_height, input_width, n_channels) = [1, 6, 128, 128, 3] and the dtype is float32. The 6 images have a disk size of 4 to 7 KB at most. Is it normal for such an input tensor to allocate so much memory? Thanks
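For reference, a minimal sketch of how the measurement can be reproduced, assuming the shape and dtype described above (the tensor here is just a placeholder, not my actual data):

```python
import torch

# Placeholder tensor with the shape from the question:
# (batch_size, num_images, input_height, input_width, n_channels), float32.
x = torch.zeros(1, 6, 128, 128, 3, dtype=torch.float32)

# Memory taken by the tensor's data, per the formula from the linked thread.
size_bytes = x.element_size() * x.nelement()
print(size_bytes)           # 1179648 bytes
print(size_bytes / 1e6)     # ~1.18 MB
```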

Hi,

When we save things to disk, they are usually compressed.
But when they are in memory, we cannot keep them compressed if we want to use them efficiently. So yes, it is expected that they take more space in memory than on disk.
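To make the difference concrete, the uncompressed size follows directly from the shape and dtype in the question (a rough back-of-the-envelope; the assumption that the files on disk are compressed image formats like JPEG/PNG is mine):

```python
# Uncompressed size is determined entirely by shape and dtype:
# 6 images * 128 * 128 pixels * 3 channels * 4 bytes per float32 element.
raw_bytes = 6 * 128 * 128 * 3 * 4
print(raw_bytes)        # 1179648 bytes, ~1.18 MB

# On disk the same images are typically stored compressed (e.g. JPEG/PNG),
# which is why 4-7 KB per file is plausible for the same pixels.
```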


Good to hear that. I was afraid my input was too big to train the CNN in a reasonable amount of time. Thanks