I am getting the following error:
RuntimeError: Trying to backward through the graph a second time, but the buffers have already been freed. Specify retain_graph=True when calling backward the first time.
I can make it go away by calling `error.backward(retain_graph=True)`.
But why does this happen? I am not calling `backward()` twice inside the loop.
I should mention that in every iteration I feed the network the same input variables, `value_path` and `value_body`:
```python
for i_emb in range(emb_times):
    optimizer_s.zero_grad()
    out_data = net(value_path, value_body)
    # L2-normalize each row of the output, then detach it as a constant target
    out_constant = (out_data.div(out_data.norm(p=2, dim=1, keepdim=True))).detach()
    error = ((out_data - out_constant).pow(2)).sum()
    error.backward(retain_graph=True)
    optimizer_s.step()
```
I do not understand why that happens.
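For reference, here is my best guess at a minimal sketch of the situation (all names here are invented, not my real code): if the tensors fed into the network were themselves produced by an earlier differentiable computation, every `backward()` inside the loop re-traverses that shared part of the graph, whose buffers were freed on the first call.

```python
import torch

# Hypothetical reproduction: shared_input is built ONCE, outside the
# loop, from a tensor that requires grad. Its piece of the autograd
# graph is therefore shared by every iteration.
base = torch.randn(4, 3, requires_grad=True)
shared_input = torch.sigmoid(base)   # differentiable op outside the loop

net = torch.nn.Linear(3, 3)
optimizer = torch.optim.SGD(net.parameters(), lr=0.1)

failed = False
for _ in range(2):
    optimizer.zero_grad()
    out = net(shared_input)          # reuses shared_input's graph
    loss = out.pow(2).sum()
    try:
        loss.backward()              # 1st call frees shared_input's buffers
    except RuntimeError:
        failed = True                # 2nd call hits the freed graph
    optimizer.step()

print(failed)  # True: the second backward() raises the same RuntimeError
# Detaching once, before the loop, would avoid it:
#   shared_input = shared_input.detach()
```

If that is what is happening in my case, `retain_graph=True` only masks the real issue of a graph prefix being shared across iterations.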