Use torch.combinations with large parameters

Hi,
I would like to generate all combinations of elements in a list. I know I can use torch.combinations for that, but I need to use large parameters, which produces a huge number of combinations. So the problem is that the result does not fit into memory. For example:

torch.combinations(torch.arange(-10, 10, dtype=torch.int8).cuda(), 8)

yields

RuntimeError: CUDA out of memory. Tried to allocate 23.84 GiB (GPU 0; 4.00 GiB total capacity; 1024 bytes already allocated; 2.62 GiB free; 2.00 MiB reserved in total by PyTorch)

Is there a way to do some kind of pagination with this function?

Hi Dofasol!

The root of the problem is that something is wrong with how
torch.combinations() is implemented. See this new thread
of mine and the GitHub issue it references:

As an aside, generating all such combinations, either one at a time
or in “pages,” is straightforward enough just using Python loops.
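
For instance, a minimal CPU-side sketch along those lines (the page
size of 100,000 is an arbitrary choice; pick whatever fits in memory):

import itertools
import torch

page_size = 100_000                               # arbitrary page size
it = itertools.combinations(range(-10, 10), 8)    # lazy iterator, nothing materialized yet
while True:
    page = list(itertools.islice(it, page_size))  # pull one "page" of combinations
    if not page:
        break
    batch = torch.tensor(page, dtype=torch.int8)  # shape: (len(page), 8)
    # ... process batch (move it to the gpu with batch.cuda() if desired) ...

itertools.combinations() yields its results lazily, so only one page
is ever held in memory at a time.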

However, if torch.combinations() did work correctly and you
were working with a truly large problem, you could split your set
of elements into multiple subsets, apply torch.combinations()
on the subsets, and recombine the sub-combinations back into
combinations of elements of the original set. You could do this a
piece at a time (“pages”) so that you would never have to materialize
the entire (potentially very large) set of combinations all at once.
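
Assuming torch.combinations() behaves as documented on the smaller
subsets, a sketch of that idea (splitting into just two halves for
simplicity; paged_combinations() is a hypothetical helper, not a
PyTorch function) might look like this:

import torch

def paged_combinations(elements, r, split):
    # hypothetical helper: split the 1-d tensor `elements` into two halves
    # at index `split`, build combinations of each half with
    # torch.combinations(), and glue the pieces back together; each yielded
    # "page" covers one way of dividing r between the two halves
    A, B = elements[:split], elements[split:]
    for k in range(max(0, r - len(B)), min(r, len(A)) + 1):
        # k elements drawn from A, r - k elements drawn from B
        combA = (torch.combinations(A, k) if k > 0
                 else torch.empty(1, 0, dtype=elements.dtype, device=elements.device))
        combB = (torch.combinations(B, r - k) if r > k
                 else torch.empty(1, 0, dtype=elements.dtype, device=elements.device))
        # pair every row of combA with every row of combB
        ia = torch.arange(combA.size(0), device=elements.device).repeat_interleave(combB.size(0))
        ib = torch.arange(combB.size(0), device=elements.device).repeat(combA.size(0))
        yield torch.cat((combA[ia], combB[ib]), dim=1)

# usage: all 20-choose-8 = 125970 combinations, one page at a time
x = torch.arange(-10, 10, dtype=torch.int8)
total = sum(page.size(0) for page in paged_combinations(x, 8, split=10))

Each page here has at most C(10, 4)**2 = 44100 rows, so you can
process (or discard) one page before the next one is built.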

Best.

K. Frank