Quick Detach() Question

If in2 is detached, you can’t call loss.backward() at all, since the operations in net2(in2) are not taken into account by the gradient computation.

When a variable is detached, the backward pass will not visit the branches that start from that variable (i.e., any operations performed on it).
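
A minimal sketch of what that means in practice (the tensors here are just placeholders):

```python
import torch

x = torch.randn(3, requires_grad=True)
y = x * 2           # tracked: y.grad_fn is MulBackward0
y_det = y.detach()  # cut from the graph: no grad_fn, requires_grad=False

z = y_det * 3       # the branch built on the detached tensor is not tracked
print(y.grad_fn)        # <MulBackward0 object ...>
print(y_det.grad_fn)    # None
print(z.requires_grad)  # False -> z.sum().backward() would raise an error
```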

So, as @albanD suggested, you need to create a new graph (hence a new variable that requires a gradient) if you want to compute gradients through the new network (net2) independently of the previous operations (net1); a sketch of that pattern is below.
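
Here is a rough sketch of that pattern, assuming net1 and net2 are ordinary modules (the shapes and the .sum() loss are made up for illustration):

```python
import torch
import torch.nn as nn

net1 = nn.Linear(4, 4)   # hypothetical stand-ins for the two networks
net2 = nn.Linear(4, 1)

in1 = torch.randn(2, 4)
out1 = net1(in1)

# New leaf variable: detached from net1's graph but requiring a gradient,
# so a fresh graph is built for everything net2 does with it.
in2 = out1.detach().requires_grad_(True)

loss2 = net2(in2).sum()
loss2.backward()         # populates net2's grads and in2.grad, never touches net1

# Optional: push the gradient into net1 separately, using in2.grad
# as the upstream gradient for out1.
out1.backward(in2.grad)
```

The second backward() call is only needed if you also want gradients in net1; the point of detaching is that the two graphs stay independent, so you can decide that separately.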