Hi all,
I have the following program (a minimal code sketch follows the list below):
1. Create a vector x.
2. Perform a finite-element physical simulation on x to obtain P (P = f(x)). This builds a DAG of the interactions the environment has with x.
3. Compute the error E = ||P_actual - P_simulated||^2.
4. Compute ∇E w.r.t. x using E.backward().
5. Update x using optimizer.step().
6. Repeat steps 2 to 5.
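
For reference, here is a minimal sketch of the loop in PyTorch. `finite_element_sim` is only a stand-in for my actual simulation (the real one builds a much larger DAG), and `P_actual` plus the Adam optimizer are assumptions I've added just to make the example self-contained:

```python
import torch

# Stand-in for the expensive finite-element simulation P = f(x).
def finite_element_sim(x):
    return x ** 2 + x.sum()

x = torch.randn(100, requires_grad=True)        # step 1: parameter vector
P_actual = torch.randn(100)                     # measured/target response (assumed given)
optimizer = torch.optim.Adam([x], lr=1e-2)

for it in range(1000):
    optimizer.zero_grad()
    P_simulated = finite_element_sim(x)         # step 2: expensive forward pass, builds the DAG
    E = ((P_actual - P_simulated) ** 2).sum()   # step 3: E = ||P_actual - P_simulated||^2
    E.backward()                                # step 4: ∇E w.r.t. x via the DAG
    optimizer.step()                            # step 5: update x
```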
The problem is that step 2 (the finite-element simulation) takes far too long to run every time. Is there a way to compute E directly from the DAG that is already present? Since the DAG is used to compute the gradient, it should also let me perform a "forward pass" that bypasses re-running the finite-element simulation on every iteration, right?
In essence, I want to run the finite-element simulation once to obtain the DAG, and then use that DAG for all my computations in every subsequent iteration.
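
To make the question concrete, here is a toy standalone version of what I mean (the forward expression is again just a stand-in for the real simulation, and the `replay` call in the final comment is hypothetical, not a real API):

```python
import torch

x = torch.randn(5, requires_grad=True)
P = (x ** 2).sum()            # stand-in for the expensive simulation f(x)
E = (P - 3.0) ** 2            # stand-in for ||P_actual - P_simulated||^2

print(E.grad_fn)              # root node of the autograd DAG recorded during the forward pass
E.backward()                  # the DAG is used here to compute dE/dx
print(x.grad)

# What I'm hoping for: something that re-evaluates this recorded DAG for an
# updated x without re-running the forward code, e.g. a hypothetical
#   E_new = replay(E.grad_fn, x_new)   # <-- not a real API, just the idea
```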
Thanks!