Update a specific model using loss.backward()

Is it possible to update only a specific model through loss.backward()?

output1 = model1(input1)

Then modify input1 using output1; call the modified input1 input2.

Feed input2 into model2:
output2 = model2(input2)

Compute the loss:
loss = criterion(output2, target)

My pipeline is as above: I want to keep the weights of model2 fixed and update only model1 using the loss.
How can I do this?


There are a few ways to do this. The simplest is to ignore the gradients computed for model2 and pass only model1's parameters to your optimizer, so that only model1 is updated when you call .step().
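A minimal sketch of this first approach, assuming two small placeholder models (the Linear layers, shapes, and the way input1 is modified are all illustrative, not from the original post):

```python
import torch
import torch.nn as nn

# Hypothetical stand-ins for model1 and model2; shapes are placeholders.
model1 = nn.Linear(4, 4)
model2 = nn.Linear(4, 2)
criterion = nn.MSELoss()

# The optimizer sees only model1's parameters, so .step() never touches model2.
optimizer = torch.optim.SGD(model1.parameters(), lr=0.1)

input1 = torch.randn(8, 4)
target = torch.randn(8, 2)

# Snapshot weights so we can verify which model changed.
w1_before = model1.weight.detach().clone()
w2_before = model2.weight.detach().clone()

output1 = model1(input1)
input2 = input1 + output1        # example modification; keep it differentiable
output2 = model2(input2)
loss = criterion(output2, target)

optimizer.zero_grad()
loss.backward()                  # gradients still flow through model2 into model1
optimizer.step()                 # but only model1's weights are updated
```

Note that with this approach gradients are still computed (and stored in .grad) for model2's parameters; they are simply never applied.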

You can also go through all the parameters of model2 and set requires_grad=False on each of them. That way no gradients will be computed (or stored) for them at all.


Thanks for your reply!!