Optimizer.step() single step?

Hi everyone.

At the moment I'm working on split learning.

Forward propagation runs from layer n to layer m, and then I compute the gradients with x.backward().
The next task is to "apply gradients up to Ln+2", which means from the last layer Lm down to layer Ln+2.

My question is: when I call optimizer.step(), will it update all layers down to layer n, or just one layer?

optimizer.step() will update all parameters that were passed to the optimizer during its initialization (or added later via .add_param_group()), using their .grad attributes.
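A minimal sketch of this behavior (the toy model and sizes here are just illustrative, standing in for your layers n..m):

```python
import torch
import torch.nn as nn

# Toy stack of layers standing in for Ln .. Lm.
model = nn.Sequential(nn.Linear(4, 4), nn.Linear(4, 4), nn.Linear(4, 1))
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

x = torch.randn(2, 4)
loss = model(x).sum()
loss.backward()

before = [p.clone() for p in model.parameters()]
optimizer.step()  # uses the .grad of every parameter registered in the optimizer

# Every layer's parameters were updated, since all of them were passed in.
changed = [not torch.equal(b, p) for b, p in zip(before, model.parameters())]
print(all(changed))
```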

Thank you for your answer.
Is there any way to say "only update up to this layer"?

You could create separate optimizers by passing only the desired subset of parameters to each one, and call step() on the corresponding optimizer during training.
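A small sketch of that idea, assuming (for illustration) that the first layer plays the role of the part you want to leave untouched and the remaining layers are Ln+2 .. Lm:

```python
import torch
import torch.nn as nn

model = nn.Sequential(nn.Linear(4, 4), nn.Linear(4, 4), nn.Linear(4, 1))

# Optimizer that only knows about the "tail" layers (here: layers 1 and 2).
tail_params = list(model[1].parameters()) + list(model[2].parameters())
opt_tail = torch.optim.SGD(tail_params, lr=0.1)

x = torch.randn(2, 4)
model(x).sum().backward()  # gradients are computed for ALL layers

first_before = model[0].weight.clone()
tail_before = model[2].weight.clone()
opt_tail.step()  # but only the parameters given to opt_tail are updated

print(torch.equal(first_before, model[0].weight))  # layer 0 unchanged
print(torch.equal(tail_before, model[2].weight))   # layer 2 updated
```

Note that backward() still populates .grad on the excluded layers; they simply aren't touched by this optimizer's step(). You may want to zero or skip those gradients explicitly in your split-learning setup.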