Integration / anti-differentiation in PyTorch

I know that PyTorch is primarily an auto-diff library, but I was wondering if there are also ways to integrate efficiently? Unfortunately, I couldn’t find anything when searching (probably due to the overloading of the term “integrate” in engineering contexts).

For simple things, I could probably come up with something like

import torch

def integrate(f, a, b, points):
    # Riemann-sum approximation; note linspace includes both endpoints,
    # so the grid spacing is (b - a) / (points - 1), not (b - a) / points.
    x = torch.linspace(a, b, points)
    area = torch.sum(f(x)) * (b - a) / (points - 1)
    return area

but I was wondering if there are more efficient ways to do it, maybe something similar to scipy.integrate?
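For what it's worth, PyTorch does ship one quadrature helper: the composite trapezoidal rule, exposed as `torch.trapz` (called `torch.trapezoid` in newer releases). A minimal sketch of the same `integrate` idea built on it, assuming a callable `f` that accepts a tensor:

```python
import torch

def integrate_trapz(f, a, b, points):
    # Composite trapezoidal rule: O(1/points**2) error for smooth
    # integrands, vs. O(1/points) for the plain Riemann sum above.
    x = torch.linspace(a, b, points, dtype=torch.float64)
    return torch.trapz(f(x), x)

# Example: integral of x^2 over [0, 1] is 1/3.
approx = integrate_trapz(lambda x: x**2, 0.0, 1.0, 1001)
```

Because everything stays in torch ops, the result remains differentiable with respect to any tensor parameters captured inside `f`.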

No, there isn’t an efficient built-in way to integrate analogous to autodiff; you basically need (more refined) quadrature formulas like yours.
I haven’t seen implementations of the more advanced quadrature rules in PyTorch.
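One such refined rule is easy to sketch yourself, though: fixed-order Gauss–Legendre quadrature, taking the nodes and weights from NumPy and doing the function evaluation in torch (a sketch, not a library API; the helper name is made up):

```python
import math
import numpy as np
import torch

def gauss_legendre(f, a, b, n=32):
    # Gauss-Legendre nodes/weights on [-1, 1] from NumPy.
    nodes, weights = np.polynomial.legendre.leggauss(n)
    x = torch.as_tensor(nodes, dtype=torch.float64)
    w = torch.as_tensor(weights, dtype=torch.float64)
    # Affine map from [-1, 1] to [a, b]; evaluation stays in torch,
    # so gradients flow through any tensor parameters inside f.
    x = 0.5 * (b - a) * x + 0.5 * (b + a)
    return 0.5 * (b - a) * torch.sum(w * f(x))

# Example: integral of sin(x) over [0, pi] is 2.
approx = gauss_legendre(torch.sin, 0.0, math.pi, n=32)
```

For smooth integrands this is far more accurate per function evaluation than equally spaced grids.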

It’s not quite what you want, but the Neural ODE code ships ODE solvers that would give you some adaptivity, even though they solve a much harder problem and aren’t as efficient for plain integration.

Best regards

Thomas

Thanks a lot, that’s very helpful :)