How to compute group sums of a single row tensor?

So, I have a row tensor like this:

values = torch.tensor([0.2, -0.1, 0.9, -0.4, 0.3, 0.1, -0.5, -0.7])

I also have a tensor holding the cumulative lengths (boundaries) of the sub-tensors:

cum_seqlens = torch.tensor([0, 2, 4, 8], dtype=torch.long)

I want to sum each subset of values based on cum_seqlens, like this:

# the boundaries [0, 2, 4, 8] split values into [0.2, -0.1], [0.9, -0.4], and [0.3, 0.1, -0.5, -0.7]
torch.tensor([0.2 + (-0.1), 0.9 + (-0.4), 0.3 + 0.1 + (-0.5) + (-0.7)])

I could just split the tensor, but the result is a tuple of sub-tensors. I want to keep the tensor intact because I need gradients to flow through this step.
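
For concreteness, this is roughly what the split looks like with the example tensors above (a sketch, assuming the boundaries [0, 2, 4, 8]); torch.split takes section lengths, which are the differences of consecutive boundaries:

import torch

values = torch.tensor([0.2, -0.1, 0.9, -0.4, 0.3, 0.1, -0.5, -0.7], requires_grad=True)
cum_seqlens = torch.tensor([0, 2, 4, 8], dtype=torch.long)

# torch.split expects section lengths, i.e. the differences of consecutive boundaries
lengths = torch.diff(cum_seqlens).tolist()  # [2, 2, 4]
parts = torch.split(values, lengths)        # tuple of 3 sub-tensors, each still tracking gradients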

I already searched for a solution and found a similar problem here. However, the accepted answer uses encoding_indices, which I don't have. Also, I have a row tensor while theirs is a column tensor.

If that is the reason you need the tensor intact, splitting it into smaller tensors should be fine for you, as long as you use view operations that autograd can understand, e.g. the slice operation.
It is definitely trickier, however, if you have a really large tensor and worry about performance.
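
A minimal sketch of that suggestion, using the tensors from the question: slice out each segment, sum it, and stack the per-segment sums. Every step is tracked by autograd, so gradients flow back to values.

import torch

values = torch.tensor([0.2, -0.1, 0.9, -0.4, 0.3, 0.1, -0.5, -0.7], requires_grad=True)
cum_seqlens = torch.tensor([0, 2, 4, 8], dtype=torch.long)

bounds = cum_seqlens.tolist()  # [0, 2, 4, 8]
# Slicing is a view operation autograd understands, so the group sums stay differentiable.
sums = torch.stack([values[s:e].sum() for s, e in zip(bounds[:-1], bounds[1:])])
print(sums)  # approximately tensor([ 0.1000,  0.5000, -0.8000])

sums.sum().backward()
print(values.grad)  # all ones: each element contributes once to exactly one group sum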

My tensor length could reach 81,920 elements and be split into 8,000 sub-tensors. Will it hurt performance if I use split?

Yes, definitely, though depending on how performance-sensitive your case is, it may still be worth checking.
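
If you want to check, here is a rough timing sketch at the sizes mentioned above (81,920 elements, 8,000 segments). The boundaries are randomly generated here, so the exact numbers are an assumption; compare the measured time against your own budget.

import time
import torch

n, n_groups = 81920, 8000

# Hypothetical random group boundaries at roughly the sizes from the question.
inner = (torch.randperm(n - 1)[: n_groups - 1] + 1).sort().values
bounds = torch.cat([torch.tensor([0]), inner, torch.tensor([n])]).tolist()

values = torch.randn(n, requires_grad=True)

start = time.perf_counter()
sums = torch.stack([values[s:e].sum() for s, e in zip(bounds[:-1], bounds[1:])])
sums.sum().backward()
elapsed = time.perf_counter() - start
print(f"{len(bounds) - 1} segments, forward + backward: {elapsed * 1e3:.1f} ms")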