Backpropagation through a layer without updating its parameters

Hello,

Consider a model that consists of 3 fully-connected layers: Layer1, Layer2 and Layer3.
In addition, I would like to minimize 2 losses: LossAll and LossPartial.
For LossAll, I would like to update the weights of all layers.
For LossPartial, I would like to update the weights of Layer1 and Layer3 only. The gradients should backpropagate through Layer2, but should not update its weights.
I use a single optimizer that optimizes the parameters of all layers, i.e. step() is called only once to optimize for both LossAll and LossPartial concurrently.

How can it be implemented?

Is it acceptable to set Layer2.requires_grad_(False) while I compute the forward pass for LossPartial, and then set Layer2.requires_grad_(True) again before computing the forward pass for LossAll?
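
To make the question concrete, here is a minimal sketch of what I have in mind (the layer sizes, loss function, and data below are placeholders, not my actual model):

```python
import torch
import torch.nn as nn

class Model(nn.Module):
    def __init__(self):
        super().__init__()
        self.layer1 = nn.Linear(10, 10)
        self.layer2 = nn.Linear(10, 10)
        self.layer3 = nn.Linear(10, 10)

    def forward(self, x):
        return self.layer3(self.layer2(self.layer1(x)))

model = Model()
# Single optimizer over the parameters of all layers.
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
criterion = nn.MSELoss()  # placeholder loss

x, target = torch.randn(8, 10), torch.randn(8, 10)

optimizer.zero_grad()

# Forward pass for LossAll: all layers will receive gradients.
loss_all = criterion(model(x), target)

# Forward pass for LossPartial: freeze Layer2 so its weights get no
# gradient from this loss, while gradients still flow through it to Layer1.
model.layer2.requires_grad_(False)
loss_partial = criterion(model(x), target)
model.layer2.requires_grad_(True)

# Backpropagate both losses and take a single optimizer step.
(loss_all + loss_partial).backward()
optimizer.step()
```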

Thank you,
TomerF

Yes it is :slight_smile: That should work as you expect.
