I don’t see another approach if the loss functions are different. (In your current code snippet they are the same, so you could just use a single forward pass, but I assume that’s a typo.)
For the second forward pass, you could set the `requires_grad` attribute of all parameters in `modelB` to `False`, if you don’t need these gradients.
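A minimal sketch of what that could look like, assuming two separate models and losses (`modelA`, `modelB`, `criterionA`, `criterionB` are placeholder names, not from your code):

```python
import torch
import torch.nn as nn

# Placeholder models and losses for illustration only
modelA = nn.Linear(10, 2)
modelB = nn.Linear(10, 2)
criterionA = nn.CrossEntropyLoss()
criterionB = nn.MSELoss()

x = torch.randn(4, 10)
targetA = torch.randint(0, 2, (4,))
targetB = torch.randn(4, 2)

# First forward/backward pass trains modelA as usual
lossA = criterionA(modelA(x), targetA)
lossA.backward()

# Freeze modelB's parameters before the second forward pass,
# since their gradients are not needed
for param in modelB.parameters():
    param.requires_grad = False

# Second forward pass: no gradients will be tracked for modelB's parameters
outB = modelB(x)
lossB = criterionB(outB, targetB)
```

If you need gradients for `modelB` again later, you can set the flags back to `True` after this pass.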