Is the computational graph retained inside a for-loop, even after the model output variable is overwritten?
Example:
```python
for epoch in range(100):
    for img in dataloader:
        optimizer.zero_grad()
        loss = 0
        for i in range(5):
            out_img = M[i](img)
            loss += criterion_L1(out_img, target[i])
        loss.backward()
        optimizer.step()
```
Above, I am using 5 models, each applied to the same input image `img`. My understanding is that although `out_img` is overwritten on each iteration, the accumulated `loss` still holds references to every forward pass, so backpropagation will reach all 5 models in `M`; that is, each iteration's graph is retained until `loss.backward()` is called.
Please correct me if I am wrong.
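To make my reasoning concrete, here is a minimal standalone sketch (with hypothetical small linear layers standing in for my 5 models and random tensors for the image and targets) that checks whether all 5 models receive gradients after a single accumulated backward pass:

```python
import torch
import torch.nn as nn

# Hypothetical stand-ins: 5 tiny models, a dummy "image", per-model targets.
M = [nn.Linear(4, 4) for _ in range(5)]
img = torch.randn(2, 4)
target = [torch.randn(2, 4) for _ in range(5)]
criterion_L1 = nn.L1Loss()

loss = 0
for i in range(5):
    out_img = M[i](img)                       # out_img is rebound each iteration...
    loss += criterion_L1(out_img, target[i])  # ...but its graph lives on inside `loss`

loss.backward()

# If every model's weights received a gradient, all 5 graphs were retained
# until backward() was called.
print(all(m.weight.grad is not None for m in M))  # prints: True
```

Rebinding the Python name `out_img` does not free the old output tensor, because the autograd graph stored in `loss` still references it, so each model's forward pass stays reachable until `backward()`.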