Clipped torch.cumsum

torch.cumsum is currently implemented as:
y_{i} = x_{i} + y_{i-1}

How do I get a running clipped version, like so:
y_{i} = clip(x_{i} + y_{i-1}, min, max)

This isn’t the same as
y = torch.clip(torch.cumsum(x, dim=0), min, max)
because the clipping needs to be applied at every step of the recurrence, not once at the end.
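For concreteness, here is a naive loop that produces the behaviour I’m after (the function name and the example values are just mine, for illustration):

```python
import torch

def clipped_cumsum(x, lo, hi):
    # Reference implementation of y_i = clip(x_i + y_{i-1}, lo, hi),
    # clipping the running total at every step. Slow for long tensors,
    # but it shows the semantics I want.
    out = torch.empty_like(x)
    running = torch.zeros((), dtype=x.dtype, device=x.device)
    for i in range(x.numel()):
        running = torch.clamp(running + x[i], lo, hi)
        out[i] = running
    return out

x = torch.tensor([1.0, 2.0, 3.0, -10.0, 4.0])
print(clipped_cumsum(x, lo=0.0, hi=4.0))
# tensor([1., 3., 4., 0., 4.])
print(torch.clip(torch.cumsum(x, dim=0), 0.0, 4.0))
# tensor([1., 3., 4., 0., 0.])  <- different: clipping only happens at the end
```

Is there a vectorized way to get this without the Python loop?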
