More than two optimizers with multiple backward passes

Hello, I have a question about calling backward with multiple losses and three optimizers.

My code is

first_model = Model_first()
second_model = Model_second()
third_model = Model_third()

first_opt = torch.optim.SGD(first_model.parameters)
second_opt = torch.optim.SGD(second_model.parameters)
third_opt = torch.optim.SGD(third_model.parameters)

first_result = first_model(data)
second_result = second_model(data)
third_result = third_model(data)

first_opt.zero_grad()
second_opt.zero_grad()

loss1 = torch.nn.MSELoss(first_result, second_result)
loss1.backward()

first_opt.step()    # update only the first model with loss1



third_opt.zero_grad()
loss2 = torch.nn.MSELoss(second_result, third_result)
loss2.backward()

second_opt.step()  # updated second model with loss1 + loss2
third_opt.step()    # update third model with loss2

I wonder whether those three optimizers update the models the way the comments describe, especially the second and third ones (second_opt.step(), third_opt.step()).
And could you explain how I can check the answer?

The optimizers don’t get valid parameters and should thus raise an error, since you would have to call the parameters method:

first_opt = torch.optim.SGD(first_model.parameters())
second_opt = torch.optim.SGD(second_model.parameters())
third_opt = torch.optim.SGD(third_model.parameters())

The loss calculation should also raise an error, since you are not creating the criterion but are instead trying to pass the tensors to the constructor.
Replace it with:

criterion = torch.nn.MSELoss()
loss1 = criterion(first_result, second_result)

I’m also not sure why your code didn’t raise any errors.
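Putting the two fixes together, a sketch of how the whole snippet could look (assuming Model_first, Model_second, Model_third, and data from the original post; the learning rate is an arbitrary placeholder). Note that loss1.backward() would also need retain_graph=True here, since loss2 is later backpropagated through second_result as well:

import torch

first_model = Model_first()
second_model = Model_second()
third_model = Model_third()

# parameters() is called and a learning rate is passed
first_opt = torch.optim.SGD(first_model.parameters(), lr=0.01)
second_opt = torch.optim.SGD(second_model.parameters(), lr=0.01)
third_opt = torch.optim.SGD(third_model.parameters(), lr=0.01)

criterion = torch.nn.MSELoss()

first_result = first_model(data)
second_result = second_model(data)
third_result = third_model(data)

first_opt.zero_grad()
second_opt.zero_grad()

loss1 = criterion(first_result, second_result)
loss1.backward(retain_graph=True)   # keep the graph: loss2 also backprops through second_result
first_opt.step()                    # first model: gradients from loss1 only

third_opt.zero_grad()
loss2 = criterion(second_result, third_result)
loss2.backward()                    # accumulates into the second model's .grad (no zero_grad in between)

second_opt.step()                   # second model: accumulated loss1 + loss2 gradients
third_opt.step()                    # third model: gradients from loss2 only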

Sorry for the confusion.
I didn’t pay attention to syntax errors because I was writing a simplified version.
My real code uses ‘model.parameters()’ and ‘torch.nn.MSELoss()(result1, result2)’.
There is no error in my code, because I wrote it the way you mentioned. Sorry…

But I still wonder: assuming there are no syntax errors, do the backward and step calls really proceed the way the comments describe?
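One way to check this is to compare the .grad buffers around each backward call. Below is a minimal, self-contained sketch in which toy nn.Linear models and random data stand in for the real models (an assumption, since the real definitions aren't shown):

import torch
import torch.nn as nn

torch.manual_seed(0)
data = torch.randn(8, 4)

# toy stand-ins for Model_first / Model_second / Model_third
first_model = nn.Linear(4, 4)
second_model = nn.Linear(4, 4)
third_model = nn.Linear(4, 4)

first_opt = torch.optim.SGD(first_model.parameters(), lr=0.1)
second_opt = torch.optim.SGD(second_model.parameters(), lr=0.1)
third_opt = torch.optim.SGD(third_model.parameters(), lr=0.1)
criterion = nn.MSELoss()

first_result = first_model(data)
second_result = second_model(data)
third_result = third_model(data)

first_opt.zero_grad()
second_opt.zero_grad()

loss1 = criterion(first_result, second_result)
loss1.backward(retain_graph=True)          # keep the graph alive for loss2
g1 = second_model.weight.grad.clone()      # second model's gradient from loss1 alone
first_opt.step()

third_opt.zero_grad()
loss2 = criterion(second_result, third_result)
g2, = torch.autograd.grad(loss2, second_model.weight, retain_graph=True)  # loss2's share, computed separately
gf = first_model.weight.grad.clone()       # first model's gradient before loss2.backward()
loss2.backward()                           # accumulates into the existing .grad buffers

print(torch.allclose(second_model.weight.grad, g1 + g2))  # True: second_opt.step() uses loss1 + loss2
print(first_model.weight.grad.equal(gf))                  # True: loss2 never reaches the first model

second_opt.step()
third_opt.step()                           # third model's .grad holds only loss2's gradient

Since neither loss1.backward() nor any later zero_grad() clears the second model's gradients before loss2.backward(), the second optimizer indeed steps with the sum of both contributions, while the third model only ever sees loss2.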