RuntimeError: Trying to backward through the graph a second time, when we store the variable

Your code is unfortunately not formatted correctly, but I assume the `backward()` call is performed inside the loop. The first backward pass frees the intermediate activations of the forward pass, so you cannot backpropagate through the same graph a second time (if this is really needed, use `retain_graph=True`, but it usually isn't). Since you are storing the `e` tensor in the `L` list and computing the sum afterwards, the next backward pass tries to backpropagate through both `e` tensors: the one from the current iteration (`e1`) as well as the one from the previous iteration (`e0`). However, since the computation graph of `e0` has already been freed, the error is raised, as the sketch below shows.
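
Here is a minimal sketch of the failing pattern as I understand it; the names `w`, `e`, and `L` are assumptions, since the original code wasn't posted in a readable form:

```python
import torch

w = torch.randn(3, requires_grad=True)
L = []

for step in range(2):
    e = (w * 2).sum()       # forward pass builds a fresh graph for this e
    L.append(e)             # keeps a reference to the graph-attached tensor
    loss = torch.stack(L).sum()
    loss.backward()         # frees the graphs of ALL stored e tensors
    # In the second iteration, backward() reaches the stored e0, whose
    # graph was already freed in the first iteration -> RuntimeError.
```

If you only need the values of the previous iterations (not their gradients), detaching before storing avoids the error, since `backward()` then only walks the current graph:

```python
    L.append(e.detach())    # store the value only; gradients flow through
                            # the current e, not the stale ones
```

Otherwise, if you really need gradients through the old iterations, `loss.backward(retain_graph=True)` keeps the graphs alive, at the cost of growing memory usage over the loop.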