Integrating the Laplacian of a Neural Network

Hi everybody,
I'm struggling with the integration of a neural network's Laplacian. The problem is the speed of training.
Specifically, I've implemented an hp-VPINN (a partial differential equation solver), which requires refining the domain. In my case, the domain is simply the square [-1,1]^2. I divided it into 9 sub-squares and, for each one, I have to integrate the Laplacian of the neural network.
I've implemented a 2D Gauss-Legendre quadrature function, which works correctly, but training is far too slow. Any tips?
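
For context, my rule is the standard tensor-product construction on each sub-square; a simplified sketch of that idea (assuming NumPy and PyTorch, with illustrative names, not my exact code) looks roughly like:

```python
import numpy as np
import torch

def gauss_legendre_2d(a, b, c, d, n):
    """Tensor-product Gauss-Legendre rule on the sub-square [a, b] x [c, d].

    Returns (points, weights): an (n*n, 2) tensor of nodes and an (n*n,)
    tensor of weights, so that (weights * f(points)).sum() approximates
    the integral of f over the sub-square.
    """
    # 1D nodes and weights on the reference interval [-1, 1]
    x, w = np.polynomial.legendre.leggauss(n)
    # Affine map to [a, b] and [c, d]; the weights pick up the Jacobian
    xs = 0.5 * (b - a) * x + 0.5 * (b + a)
    ys = 0.5 * (d - c) * x + 0.5 * (d + c)
    wx = 0.5 * (b - a) * w
    wy = 0.5 * (d - c) * w
    # Tensor-product grid of nodes and the corresponding product weights
    X, Y = np.meshgrid(xs, ys, indexing="ij")
    pts = np.stack([X.ravel(), Y.ravel()], axis=-1)
    wts = np.outer(wx, wy).ravel()
    return (torch.as_tensor(pts, dtype=torch.float32),
            torch.as_tensor(wts, dtype=torch.float32))
```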

It's hard to give more specific recommendations without seeing your code. To start: do you batch the parts of the computation that are repeated? In other words, is the network evaluated at the quadrature points inside a for loop, or with batched operations?
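
If you're currently looping over individual points, the usual fix is to evaluate the network and its Laplacian on the whole batch of quadrature points at once. A sketch of that pattern in PyTorch (the scalar-output `model` and the (N, 2) point layout are assumptions on my part):

```python
import torch

def batched_laplacian(model, pts):
    """Laplacian of a scalar-output network at a batch of points.

    pts: (N, 2) tensor of quadrature points.
    Returns an (N,) tensor of Laplacian values.
    """
    pts = pts.detach().clone().requires_grad_(True)
    u = model(pts)  # (N, 1): one batched forward pass for all points
    # Summing u gives per-point gradients, since each row of pts only
    # affects its own output row.
    grad_u = torch.autograd.grad(u.sum(), pts, create_graph=True)[0]  # (N, 2)
    lap = pts.new_zeros(pts.shape[0])
    for i in range(pts.shape[1]):  # 2 autograd calls total, not N
        lap = lap + torch.autograd.grad(
            grad_u[:, i].sum(), pts, create_graph=True
        )[0][:, i]
    return lap
```

The integral over one sub-square is then just `(weights * batched_laplacian(model, points)).sum()`, and since the sub-squares are independent, you can concatenate the quadrature points of all 9 of them into a single batch and do one forward/backward sweep per training step.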