RuntimeError: Trying to backward through the graph a second time,

Hi,

I am just getting started with PyTorch.

I am trying to run the code below and get the error “RuntimeError: Trying to backward through the graph a second time, but the saved intermediate results have already been freed. Specify retain_graph=True when calling backward the first time”.

I have tried replacing `loss.backward()` with `loss.backward(retain_graph=True)`, but it does not resolve the issue.

Please could you advise? Thanks in advance.

for epoch in range(20):
    opt_Adam.zero_grad()
    output = model(x_train)
    loss = loss_func(output, y_train)
    loss.backward()  # also tried: loss.backward(retain_graph=True)
    opt_Adam.step()
    print(loss.item())

Could you post an executable code snippet to reproduce this issue, please?
You can use random input data (instead of a real dataset), which would make debugging easier.
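For example, a minimal self-contained sketch could look like the snippet below. The `nn.Linear` model, `MSELoss`, and tensor shapes are placeholders I'm assuming, since your post doesn't show how `model`, `loss_func`, `x_train`, and `y_train` are created:

```python
import torch
import torch.nn as nn

# Placeholder setup -- swap in your actual model, loss function, and data
model = nn.Linear(10, 1)                  # assumed model
loss_func = nn.MSELoss()                  # assumed loss function
opt_Adam = torch.optim.Adam(model.parameters(), lr=1e-3)

x_train = torch.randn(100, 10)            # random input data
y_train = torch.randn(100, 1)             # random targets

for epoch in range(20):
    opt_Adam.zero_grad()
    output = model(x_train)              # forward pass
    loss = loss_func(output, y_train)    # compute loss
    loss.backward()                      # backward pass
    opt_Adam.step()                      # update parameters
    print(loss.item())
```

With these placeholders the loop runs without the error, so if it only appears with your real model or data setup, that difference is most likely where the graph is being reused across iterations.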

PS: you can post code snippets by wrapping them into three backticks ``` :wink: