# How to chunk a 2-D tensor

Suppose I have a 2-D tensor A and a 1-D tensor C recording the length of each chunk.
For example, C[0]=4 means I want A[0:4]; C[1]=5 means I want A[4:9], and so on.

It is easy to iterate over C manually to collect all these chunks, but this is very slow.
I am wondering if there is an efficient operation that can get these chunks in a faster way?

Thanks for the help !

You could use `split` for that:

```
import torch

x = torch.arange(20)
lengths = [4, 5, 11]
x.split(lengths, 0)
```
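Since the A in the question is 2-D, the same call works by splitting along `dim=0` — a small sketch (the shapes here are made up for illustration):

```python
import torch

# A 2-D tensor with 20 rows and 2 columns; split along dim 0
# into chunks whose row counts are given by `lengths`.
A = torch.arange(40).reshape(20, 2)
lengths = [4, 5, 11]
chunks = A.split(lengths, dim=0)
# chunks[0] is A[0:4], chunks[1] is A[4:9], chunks[2] is A[9:20]
```

`split` returns views into the original tensor, so no data is copied.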

Thanks !
This is what I need !

I want to ask a related question.
What if I need to `torch.sum()` each chunk — is there any way to avoid the iteration?

Not that I’m aware of.
However, you could manually split the tensor and sum the chunks:

```
start_idx = 0
for l in lengths:
    print(x.narrow(0, start_idx, l).sum())
    start_idx += l
```

I haven’t timed it, but depending on the tensor shape, you may be better off using `split` and then summing each chunk in a loop.
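A sketch of that `split`-then-sum variant, reusing the `x` and `lengths` from the earlier snippet (float dtype so the sums are floats):

```python
import torch

x = torch.arange(20.)
lengths = [4, 5, 11]
# The Python loop is over chunks, not elements; each sum runs in C.
sums = [chunk.sum() for chunk in x.split(lengths, 0)]
# per-chunk sums: 6., 30., 154.
```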

Thanks !
I have tried both; the manual `narrow` loop is much slower than `split` plus `sum`.
It seems we cannot avoid the iteration.
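For reference, a loop-free segment sum can be sketched with `repeat_interleave` and `index_add_` — this is an assumption-laden alternative (it assumes the chunk lengths exactly partition the tensor), not something from the thread above:

```python
import torch

x = torch.arange(20.)
lengths = torch.tensor([4, 5, 11])
# Segment-id vector [0,0,0,0, 1,1,1,1,1, 2,...]: one id per element,
# repeated according to each chunk's length.
seg = torch.repeat_interleave(torch.arange(len(lengths)), lengths)
# Accumulate each element of x into the slot of its segment id.
sums = torch.zeros(len(lengths)).index_add_(0, seg, x)
# sums holds 6., 30., 154.
```

Whether this beats the `split` loop depends on the number and size of the chunks; for a few large chunks the loop overhead is negligible.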