Hello everyone!
I'm trying to build a multi-task regressor model using PyTorch.
However, when the code calls backward() a second time, it raises an error ("Trying to backward through the graph a second time").
I added backward(retain_graph=True) and it runs, but I think that keeps one graph alive through both individual layers 1 and 2 together.
I want to compute the results separately: once through the shared layer and individual layer 1, and once through the shared layer and individual layer 2.
Could you give me some advice?
Here is my code.
loss_ = []  # list to store the losses
for epoch in range(400):
    running_loss = 0.0
    for _, data in enumerate(trainloader, 0):
        inputs, values = data
        # forward
        shared_result = shared_model(inputs)
        outputs_1 = indivi_model_1(shared_result)
        outputs_2 = indivi_model_2(shared_result)
        # backward
        # Shared layer
        # shared_optimizer.zero_grad()
        # shared_optimizer.step()
        # # loss.backward()
        # Individual layer 1
        indivisial_optimizer_1.zero_grad()
        loss_1 = criterion(outputs_1, values)
        loss_1.backward()
        indivisial_optimizer_1.step()
        # Individual layer 2
        indivisial_optimizer_2.zero_grad()
        loss_2 = criterion(outputs_2, values)
        loss_2.backward()
        indivisial_optimizer_2.step()
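For context, the error occurs because loss_1.backward() frees the graph through the shared layer, so loss_2.backward() has nothing to traverse. One common fix is to sum the two head losses and call backward() once; gradients still flow separately through each head and accumulate correctly in the shared layer. Below is a minimal sketch of that approach — the layer sizes, batch shapes, and the single combined optimizer are placeholder assumptions, not the original architecture:

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

# hypothetical stand-ins for the shared trunk and the two heads
shared_model = nn.Linear(4, 8)
indivi_model_1 = nn.Linear(8, 1)
indivi_model_2 = nn.Linear(8, 1)

criterion = nn.MSELoss()
# one optimizer over all parameters keeps the update logic simple;
# separate optimizers also work as long as backward() is called once
optimizer = torch.optim.SGD(
    list(shared_model.parameters())
    + list(indivi_model_1.parameters())
    + list(indivi_model_2.parameters()),
    lr=0.01,
)

inputs = torch.randn(16, 4)   # dummy batch
values = torch.randn(16, 1)

# forward: one shared pass feeds both heads
shared_result = shared_model(inputs)
outputs_1 = indivi_model_1(shared_result)
outputs_2 = indivi_model_2(shared_result)

# backward: sum the losses and backpropagate once —
# no retain_graph=True needed, and the shared layer
# accumulates gradients from both heads
optimizer.zero_grad()
loss = criterion(outputs_1, values) + criterion(outputs_2, values)
loss.backward()
optimizer.step()

print(shared_model.weight.grad is not None)  # prints True
```

If you do need two separate backward() calls (e.g. to step each head's optimizer between them), pass retain_graph=True to the first call only; just be aware the shared layer's gradients from the two losses will still accumulate unless you zero them in between.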