Hi everybody,
I’m struggling with the integration of a neural network’s Laplacian; the problem is the speed of training.
In particular, I’ve implemented an hp-vPINN (a Partial Differential Equation solver), which requires a refinement of the domain. In my case, the domain is simply the square [-1,1]^2. I divided it into 9 smaller squares and, for each square, I have to integrate the Laplacian of the neural network.
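
To compute the Laplacian at a batch of points I rely on autograd. The sketch below shows roughly what I mean; it is a simplified stand-in, not my exact code, and assumes a PyTorch model that maps an (N, 2) batch of points to an (N, 1) output:

```python
import torch


def laplacian(model, xy):
    """Laplacian u_xx + u_yy of the scalar network output at a batch of points.

    xy is an (N, 2) tensor of evaluation points; returns an (N,) tensor.
    """
    xy = xy.detach().requires_grad_(True)
    u = model(xy)  # (N, 1), one scalar value per point
    grad_u = torch.autograd.grad(
        u, xy, grad_outputs=torch.ones_like(u), create_graph=True
    )[0]  # (N, 2) -> (u_x, u_y)
    lap = 0.0
    for i in range(xy.shape[1]):
        # second derivative of u with respect to the i-th coordinate
        d2 = torch.autograd.grad(
            grad_u[:, i], xy,
            grad_outputs=torch.ones_like(grad_u[:, i]),
            create_graph=True,
        )[0][:, i]
        lap = lap + d2
    return lap
```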
I’ve implemented a Gauss-Legendre quadrature function in 2D, which works correctly, but the training is far too slow. Any tips?
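
For context, this is roughly the kind of per-element quadrature I mean (again a simplified sketch, not my actual code; the node count `n` and the helper names are just placeholders):

```python
import numpy as np
import torch


def integrate_laplacian(model, a, b, c, d, n=10, device="cpu"):
    """Gauss-Legendre approximation of the integral of the network Laplacian
    over the rectangle [a, b] x [c, d], using an n x n tensor-product rule."""
    nodes, weights = np.polynomial.legendre.leggauss(n)  # 1D nodes/weights on [-1, 1]
    # affine map of nodes and weights from [-1, 1] to [a, b] and [c, d]
    x = 0.5 * (b - a) * nodes + 0.5 * (b + a)
    y = 0.5 * (d - c) * nodes + 0.5 * (d + c)
    wx = 0.5 * (b - a) * weights
    wy = 0.5 * (d - c) * weights
    # tensor-product grid of quadrature points and weights
    X, Y = np.meshgrid(x, y, indexing="ij")
    pts = torch.tensor(
        np.stack([X.ravel(), Y.ravel()], axis=1), dtype=torch.float32, device=device
    )
    w2d = torch.tensor(np.outer(wx, wy).ravel(), dtype=torch.float32, device=device)
    lap = laplacian(model, pts)  # Laplacian at all n*n nodes in one batched call
    return (w2d * lap).sum()
```

During training this would be called once per subdomain, so 9 such integrals per step in my setup.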