Concatenating aranges from start and end tensors without a loop

starts and ends have the same size.
Both are variable, but each end is larger than the corresponding start.

starts = torch.tensor([0, 1, 2, 3])
ends = torch.tensor([2, 4, 6, 8])
aranges = []
for s, e in zip(starts, ends):
    aranges.append(torch.arange(s, e))
result = torch.cat(aranges, dim=0)

Can we get the same result without a loop?

I don’t think this is directly possible, since each intermediate tensor has a different shape, so appending them to a list sounds like a valid approach.
At least I’m not aware of a loop-free approach, but let’s wait for others in case they can come up with an idea.
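For what it's worth, here is one possible loop-free sketch using repeat_interleave and cumsum. The variable names (lengths, csum, offsets) are mine, and it assumes 1-D integer tensors with each end >= the corresponding start, as in the example above:

```python
import torch

starts = torch.tensor([0, 1, 2, 3])
ends = torch.tensor([2, 4, 6, 8])

lengths = ends - starts                 # length of each arange
csum = torch.cumsum(lengths, dim=0)     # cumulative lengths
total = int(csum[-1])                   # total number of output elements

# Offset of each output element within its own range: 0, 1, ..., len-1.
# (csum - lengths) is the starting position of each range in the output,
# repeated once per element of that range.
offsets = torch.arange(total) - (csum - lengths).repeat_interleave(lengths)

# Shift each offset by its range's start value.
result = starts.repeat_interleave(lengths) + offsets
# result: tensor([0, 1, 1, 2, 3, 2, 3, 4, 5, 3, 4, 5, 6, 7])
```

This avoids the Python-level loop entirely, at the cost of materializing the repeated starts and offsets, so whether it is actually faster likely depends on the number and size of the ranges.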