Integrating 6 variables in torch?

Hi there,

I am using PyTorch to do variational energy minimization and I need to evaluate a definite integral:

\int_0^{2\pi} \int_0^{2\pi} \int_0^{\pi} \int_0^{\pi} \int_0^{L} \int_0^{L} f(x_1, x_2, \theta_1, \theta_2, \phi_1, \phi_2)\, dx_1\, dx_2\, d\theta_1\, d\theta_2\, d\phi_1\, d\phi_2

The function is not simple to integrate by hand, and the only tool I have found that can handle something like this is scipy.integrate.nquad. But nquad doesn't allow the use of .backward(), and I couldn't find a PyTorch alternative after looking around.

Would anyone here be able to point me towards something that might be helpful?

Much appreciated! Thank you!!!

Hi Deepsana!

I assume that your integrand depends on some parameters (with respect to
which you are not integrating) and you wish to use autograd to compute the
gradient of the value of the integral with respect to those parameters.

Numerical integration becomes increasingly difficult and expensive as the
dimensionality of the integrand increases. Six dimensions is already likely
to be problematic.

I see that some of your variables are angles. If your integrand becomes
highly oscillatory (for some values of your parameters), you are likely to be
in a world of hurt.

If you can integrate over any of the dimensions analytically – especially any
oscillatory dimensions – you should do so, as it will be easier to integrate
numerically over the remaining dimensions.

Pytorch does offer trapezoid(), a tool for one-dimensional integration, but
you’re unlikely to be successful using it in a nested fashion for a six-dimensional
integral.
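For what it's worth, trapezoid() does play nicely with autograd on a fixed grid. Here is a minimal one-dimensional sketch (the integrand and the parameter a are made up for illustration):

```python
import torch

# hypothetical parameter we want the gradient with respect to
a = torch.tensor(1.5, requires_grad=True)

# fixed one-dimensional grid on [0, pi]
x = torch.linspace(0.0, torch.pi, 1001)

# integrand that depends on the parameter a
y = torch.sin(a * x)

# trapezoidal-rule approximation of the integral
integral = torch.trapezoid(y, x)

# backpropagate to get d(integral) / da
integral.backward()
print(integral.item(), a.grad.item())
```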

One approach would be to use a simple version of Monte Carlo integration
where you randomly select a set of points from the six-dimensional region over
which you are integrating and then average the integrand over those points.
It might make sense to use the same fixed set of points as you optimize your
parameters.

This can be expensive – lots of random points – but is straightforward and
you will be able to backpropagate through the value of the integral without
any problem (except expense).
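Here is a rough sketch of what I mean, with a made-up six-dimensional integrand and made-up parameters (replace f() with your actual integrand):

```python
import torch

L = 2.0                    # hypothetical box length, for illustration
n_samples = 100_000

# parameters of the integrand (made up for illustration)
params = torch.tensor([0.3, 0.7], requires_grad=True)

def f(x1, x2, th1, th2, ph1, ph2):
    # stand-in integrand; replace with your actual energy density
    return (torch.exp(-params[0] * (x1 - x2) ** 2)
            * torch.cos(params[1] * (th1 - th2))
            * torch.sin(ph1) * torch.sin(ph2))

# fixed set of random points, reused at every optimization step
torch.manual_seed(0)
x1  = torch.rand(n_samples) * L
x2  = torch.rand(n_samples) * L
th1 = torch.rand(n_samples) * torch.pi
th2 = torch.rand(n_samples) * torch.pi
ph1 = torch.rand(n_samples) * 2 * torch.pi
ph2 = torch.rand(n_samples) * 2 * torch.pi

# volume of the six-dimensional integration region
volume = L * L * torch.pi * torch.pi * (2 * torch.pi) * (2 * torch.pi)

# Monte Carlo estimate: volume times the mean of the integrand
integral = volume * f(x1, x2, th1, th2, ph1, ph2).mean()

# backpropagate through the estimate to the parameters
integral.backward()
print(integral.item(), params.grad)
```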

If you know anything about the structure of your integrand, you can refine
this approach with importance sampling where you cluster your random
points more densely in regions where the integrand is large or is varying
rapidly.
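As a one-dimensional illustration of the idea (the proposal and the integrand here are made up), you draw your points from a proposal density q that is large where the integrand is large and divide by q when averaging:

```python
import math
import torch

L = 2.0
n_samples = 100_000
rate = 3.0   # hypothetical proposal parameter, chosen so q is large where f is large

# proposal q: exponential density with the given rate, truncated to [0, L],
# sampled by inverse-transform sampling
u = torch.rand(n_samples)
norm = 1.0 - math.exp(-rate * L)
x = -torch.log(1.0 - u * norm) / rate
q = rate * torch.exp(-rate * x) / norm

def f(x):
    # stand-in integrand that is peaked near x = 0
    return torch.exp(-5.0 * x)

# importance-sampling estimate of \int_0^L f(x) dx = E_q[ f(x) / q(x) ]
integral = (f(x) / q).mean()
print(integral.item())   # exact value is (1 - exp(-5 L)) / 5
```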

Last, torchquad (github and paper) is supposedly a pytorch multidimensional
integration package that supports backpropagation. (I have not used it.)

If it works with autograd as advertised and they’ve done a good job implementing
their chosen suite of integration methods, I would suggest that you start with
VEGAS Enhanced – an adaptive importance / stratified-sampling Monte Carlo
integration algorithm.
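Since I have not used torchquad, treat the following sketch as an assumption about its API (the VEGAS class, set_up_backend(), and the integrate() signature) and check its documentation before relying on it:

```python
import torch
from torchquad import VEGAS, set_up_backend   # assumed API -- see the torchquad docs

# tell torchquad to use the pytorch backend
set_up_backend("torch", data_type="float64")

L = 2.0   # hypothetical box length, for illustration

def integrand(x):
    # torchquad passes a (n_points, 6) tensor; unpack the six variables
    x1, x2, th1, th2, ph1, ph2 = x.unbind(dim=1)
    # stand-in integrand for illustration
    return torch.sin(th1) * torch.sin(th2) * torch.exp(-(x1 - x2) ** 2)

vegas = VEGAS()
result = vegas.integrate(
    integrand,
    dim=6,
    N=100_000,
    integration_domain=[[0, L], [0, L],
                        [0, torch.pi], [0, torch.pi],
                        [0, 2 * torch.pi], [0, 2 * torch.pi]],
)
print(result)
```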

Good luck.

K. Frank


Thank you very much for your recommendations!!! I will start exploring those.