PyTorch equivalent of exclusive cumsum?

Hi All,

I was wondering if there’s a way to do an exclusive cumsum (like the one implemented in TensorFlow’s cumsum). An example of this would be:

tf.cumsum([a,b,c], exclusive=False) => [a, a+b, a+b+c]  #standard cumsum
tf.cumsum([a, b, c], exclusive=True)  => [0, a, a + b]  #exclusive cumsum

Let’s say I have a tensor of [1, 2, 3] and I take the cumsum of it. That would return a tensor of [1, 3, 6]. However, what if I wanted to take the “exclusive” cumsum, which would be [0, 1, 3]? How exactly could this be done efficiently? Are there plans to add this feature to PyTorch?

Thanks in advance! 🙂

Hi Alpha!

The best I can think of is to use pytorch’s “standard” cumsum() and
use roll() to right-shift the result:

>>> t = torch.tensor([1, 2, 3])
>>> res = t.cumsum(0).roll(1, 0)
>>> res[0] = 0
>>> print(res)
tensor([0, 1, 3])
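As a variation on the same idea (a sketch, not a PyTorch built-in; `exclusive_cumsum` is a hypothetical helper name), you can avoid the in-place write by prepending a zero and dropping the last cumsum entry, which also generalizes to a chosen dimension:

```python
import torch

def exclusive_cumsum(t, dim=0):
    # Hypothetical helper: compute the standard cumsum, keep all but its
    # last element along `dim`, and prepend a zero slice, yielding
    # [0, a, a+b, ...] instead of [a, a+b, a+b+c].
    zeros = torch.zeros_like(t.narrow(dim, 0, 1))
    partial = t.cumsum(dim).narrow(dim, 0, t.size(dim) - 1)
    return torch.cat([zeros, partial], dim=dim)

t = torch.tensor([1, 2, 3])
print(exclusive_cumsum(t))  # tensor([0, 1, 3])
```

Because nothing is mutated in place, this form should also be safe inside autograd graphs.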


K. Frank


Hi K. Frank!

That solution looks fantastic!

Thank you! 🙂