I have a list of loss values called `loss_arr`:

```
[tensor([0.6828], grad_fn=<SubBackward0>), tensor([0.152], grad_fn=<SubBackward0>)]
```
When I tried to backward the total loss, I did:

```python
loss = sum(loss_arr)
loss.backward()
```
However, it gave me:

```
RuntimeError: Trying to backward through the graph a second time, but the buffers have already been freed. Specify retain_graph=True when calling backward the first time.
```
If I backward each loss individually, for example, there is no problem.
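For reference, here is a minimal sketch of the situation (the toy weight `w` and the random data are stand-ins I made up, not my real model):

```python
import torch

# Toy stand-ins (assumption: a weight vector and random data, not the real model)
w = torch.randn(3, requires_grad=True)
x = torch.randn(2, 3)
y = torch.randn(2)

# Two scalar losses that each depend on w, like the tensors in loss_arr
loss_arr = [(x[i] @ w - y[i]).abs() for i in range(2)]

# Backwarding one loss on its own is fine: its graph is traversed once.
loss_arr[0].backward()

# But that backward() call frees the graph's saved buffers, so backwarding
# a sum that includes the already-backwarded loss raises the RuntimeError:
try:
    sum(loss_arr).backward()
except RuntimeError as e:
    print("RuntimeError:", e)
```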
So, how can I backward the total loss? And why did my backward fail?
Thank you all, and have a nice quarantine…