I have two neural networks. In theory, the loss computed at the end for each of them should depend on a disjoint set of variables, but I still get the common error:
RuntimeError: Trying to backward through the graph a second time (or directly access saved tensors after they have already been freed).
Is it possible to inspect all the variables in the computational graph that backpropagation (the backward function) traverses?
And is it possible to see the variables where the two graphs coincide?
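For context, here is a minimal sketch of the kind of situation I mean (not my real code; the layer sizes and the shared ReLU are just placeholders), together with the sort of grad_fn traversal I am imagining, assuming graph nodes can be compared by identity:

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

# Hypothetical minimal setup: two networks whose losses unintentionally
# share a piece of the graph (here, a ReLU applied once to the input).
net_a = nn.Linear(4, 1)
net_b = nn.Linear(4, 1)

x = torch.randn(8, 4, requires_grad=True)
shared = torch.relu(x)          # this node ends up in BOTH backward graphs

loss_a = net_a(shared).sum()
loss_b = net_b(shared).sum()

# Walk grad_fn / next_functions to collect every node of a backward graph.
def graph_nodes(loss):
    seen, stack = set(), [loss.grad_fn]
    while stack:
        fn = stack.pop()
        if fn is None or fn in seen:
            continue
        seen.add(fn)
        stack.extend(next_fn for next_fn, _ in fn.next_functions)
    return seen

# Nodes present in both sets are exactly where the graphs coincide;
# leaf tensors appear as AccumulateGrad nodes.
overlap = graph_nodes(loss_a) & graph_nodes(loss_b)
for fn in overlap:
    print(type(fn).__name__)

loss_a.backward()   # frees the saved tensors of the shared ReLU node
# loss_b.backward() # now raises the RuntimeError quoted above
```

Is something like this traversal a reliable way to find the coinciding variables, or is there a built-in tool for it?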