Sparsely represent a tensor containing only two values

torch.sparse only compresses the zeros: every nonzero element is still stored explicitly.

But if my vector/tensor is something like [1, 1, 1, 1, 1, 1, 0, 0, 0, 0, 0], is there any way to have the 1s compressed as well and reduce memory usage further?
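To make the issue concrete, here is a minimal sketch (assuming a float32 mask) showing that converting the vector above to sparse COO still stores every 1 explicitly:

```python
import torch

# The vector above: mostly ones, a run of zeros.
mask = torch.tensor([1, 1, 1, 1, 1, 1, 0, 0, 0, 0, 0], dtype=torch.float32)
sparse = mask.to_sparse()

# Sparse COO keeps one (index, value) pair per nonzero element,
# so all six 1s are stored explicitly -- no saving on the ones.
print(sparse.values().numel())  # 6
```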

I have large tensors consisting of 1s and 0s (masks), and I need to reduce the memory they use so that I can increase the batch size during training. Please share any pointers you may have for optimizing memory usage during training.
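For reference, this is roughly how I am sizing the problem; the shape here is a placeholder, and my real masks are larger:

```python
import torch

# Hypothetical mask shape for illustration; the real masks are bigger.
mask = torch.zeros(4096, 4096, dtype=torch.float32)
mask[:2048] = 1.0  # roughly half ones, half zeros

# element_size() is bytes per element (4 for float32), so the dense
# footprint is bytes-per-element times number of elements.
bytes_dense = mask.element_size() * mask.numel()
print(f"dense float32 mask: {bytes_dense / 1e6:.1f} MB")  # 67.1 MB
```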