To me this is somewhat unintuitive: why wouldn't using torch.tensor on the list directly work?
torch.tensor([torch.tensor([i]).repeat(15) for i in range(0, 5)])
Every tensor in the list has the same size, and conceptually it's already a matrix/tensor, yet this errors out for me (a ValueError about only one-element tensors being convertible to Python scalars, if I recall correctly). Somehow only:
torch.stack([torch.tensor([i]).repeat(15) for i in range(0, 5)])
tensor([[0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0],
        [1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1],
        [2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2],
        [3, 3, 3, 3, 3, 3, 3, 3, 3, 3, 3, 3, 3, 3, 3],
        [4, 4, 4, 4, 4, 4, 4, 4, 4, 4, 4, 4, 4, 4, 4]])
worked.
Btw, is this the most efficient way to do it in a vectorized fashion?
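For comparison, here is a loop-free sketch I was considering (assuming expand/broadcasting behave the way I think they do; the sizes 5 and 15 are just the shapes from above):

import torch

# The stack-of-repeats version from above.
rows = torch.stack([torch.tensor([i]).repeat(15) for i in range(0, 5)])

# Loop-free alternative: build a (5, 1) column of row indices and let
# expand broadcast it out to (5, 15). expand returns a view, so the
# values are not actually copied.
alt = torch.arange(5).unsqueeze(1).expand(5, 15)

print(torch.equal(rows, alt))  # expecting True

Would something along those lines be preferable to stack, or does it not matter at this size?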