I want to update the same model twice inside one loop, but I still get the error saying the computation graph has already been freed. Why does that happen, and how can I finish the logic below? Thanks, everyone.
import torch
import torch.nn as nn

model = nn.Linear(1, 1)
optimizer = torch.optim.Adam(model.parameters(), lr=0.1, betas=(0.5, 0.999))

for i in range(200):
    # first update
    optimizer.zero_grad()
    input = torch.tensor([1.0])  # float input; nn.Linear rejects integer tensors
    output = model(input)
    loss_1 = 100 - output
    loss_1.backward()
    optimizer.step()

    # second update
    optimizer.zero_grad()
    input = torch.tensor([2.0])
    output = model(input)
    loss_2 = 200 - output
    loss_2.backward()
    optimizer.step()
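For context on the error message itself: as posted (with a fresh forward pass before each backward), every backward() call has its own graph, so the "graph has been freed" error usually means the failing code reused one forward output for two backward calls. A minimal sketch of that situation and the retain_graph=True workaround (the model and tensors here are illustrative, not the ones above):

```python
import torch
import torch.nn as nn

model = nn.Linear(1, 1)
out = model(torch.tensor([1.0]))

# Backward through the same graph twice: the first call must keep
# the graph alive, otherwise the second call raises
# "Trying to backward through the graph a second time ...".
out.backward(retain_graph=True)
out.backward()  # works only because the graph was retained above
```

Re-running the forward pass before each backward (as in the loop above) is usually cheaper and cleaner than retaining the graph.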