Equivalent of numpy split

In np.split you can pass indices_or_sections to specify where the splits happen. Is there a PyTorch equivalent, or a vectorized solution that does the same thing?
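For reference, this is the NumPy behavior being asked about: passing a list of boundary indices to np.split produces chunks of unequal sizes.

```python
import numpy as np

x = np.arange(9)
# indices_or_sections as a list gives boundary indices,
# so the resulting chunks may have unequal sizes: 2, 3, and 4 here.
parts = np.split(x, [2, 5])
```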

There is torch.split, which behaves the same for the case where you give it a single int. If you need to split into unequal-sized tensors, you’ll need to use a loop with indexing or calls to narrow.

x = torch.arange(0, 9)
x.split(3)
# (tensor([0, 1, 2]), tensor([3, 4, 5]), tensor([6, 7, 8]))
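The loop-and-narrow approach suggested above for unequal sizes can be sketched roughly like this (split_by_sizes is a hypothetical helper name, not a library function):

```python
import torch

def split_by_sizes(x, sizes, dim=0):
    # Hypothetical helper: split x into chunks of the given (possibly
    # unequal) sizes along dim, using narrow in a Python loop.
    # narrow(dim, start, length) returns a view, so no data is copied.
    chunks = []
    offset = 0
    for size in sizes:
        chunks.append(x.narrow(dim, offset, size))
        offset += size
    return tuple(chunks)

x = torch.arange(0, 9)
parts = split_by_sizes(x, [2, 3, 4])
```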

Thanks! I am aware of torch.split, but I needed unequal splits. My current solution uses loops, and I was hoping to avoid them. Seems like it can’t be done. Thanks for the response!

If you think that version of split would be generally useful, you can write a feature request on the GitHub issue tracker.

http://pytorch.org/docs/master/torch.html?highlight=chunk#torch.chunk

That’s not quite the same: chunk still splits into equal-sized chunks; you just specify the number of chunks instead of the chunk size. numpy.split lets you specify the size of each individual chunk.
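A quick side-by-side illustrates the difference: torch.chunk takes the number of pieces and makes them as equal as possible, while np.split can take boundary indices and produce unequal pieces.

```python
import torch
import numpy as np

x = torch.arange(0, 9)
# torch.chunk: specify the NUMBER of chunks; sizes come out equal
# (or as close to equal as possible).
equal = torch.chunk(x, 3)          # three chunks of size 3

# np.split: specify boundary INDICES; chunk sizes may differ.
unequal = np.split(np.arange(9), [2, 5])  # sizes 2, 3, 4
```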

I implemented a split function for different sizes in this feature request.
Maybe it’s what you need.
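For full parity with np.split’s indices-style argument, one possible sketch converts boundary indices into per-chunk sizes and then narrows in a loop (split_at is a hypothetical name; this is not the feature-request implementation itself):

```python
import torch

def split_at(x, indices, dim=0):
    # Hypothetical helper mimicking np.split's indices_or_sections:
    # boundary indices are first converted into per-chunk sizes.
    boundaries = list(indices) + [x.size(dim)]
    chunks, prev = [], 0
    for b in boundaries:
        chunks.append(x.narrow(dim, prev, b - prev))
        prev = b
    return tuple(chunks)

x = torch.arange(0, 9)
parts = split_at(x, [2, 5])  # chunks of sizes 2, 3, 4
```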


Thanks! This is exactly what I need, though it still uses Python for loops. Hopefully your version gives me enough of a speedup. 🙂