Two optimisers in a network?

Hi,

(I am quite new to neural networks, so please forgive me if I say anything that is incorrect.)

Q: Is it possible to use 2 different optimisers in a neural network, e.g. a different optimiser for certain layers?

Thanks for your help

James

Hi,

Yes, it is possible.
You just need to give the corresponding parameters to the right optimizer.
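For example, something like this (the model, which layers go to which optimizer, and the learning rates are all just made up for illustration):

```python
import torch
import torch.nn as nn

# Toy two-layer model; the sizes are just for illustration
model = nn.Sequential(
    nn.Linear(10, 20),  # first layer
    nn.ReLU(),
    nn.Linear(20, 2),   # last layer
)

# Each optimizer is only given the parameters it should update
opt_first = torch.optim.SGD(model[0].parameters(), lr=0.01)    # SGD on the first layer
opt_last = torch.optim.Adam(model[2].parameters(), lr=0.001)   # Adam on the last layer
```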

Could you please explain how this works?

If I have two optimisers in the network, does it perform the same amount of back-propagation as it would with just one optimiser?

Thank you :smiley:

The backprop will just populate the .grad field of all parameters given a loss function.
If you have a single loss and two optimizers, you do a single backward and then call each optimizer's step().
If you have one loss per optimizer, then you want to do one backward, one step, zero the gradients, then the other backward and the other step, so that each optimizer only sees the gradients corresponding to its own loss.
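As a rough sketch of both cases (the model, data, losses and hyper-parameters below are all made up for illustration):

```python
import torch
import torch.nn as nn

# Toy setup: two optimizers, each owning a different layer's parameters
model = nn.Sequential(nn.Linear(10, 20), nn.ReLU(), nn.Linear(20, 2))
opt_a = torch.optim.SGD(model[0].parameters(), lr=0.01)
opt_b = torch.optim.Adam(model[2].parameters(), lr=0.001)
criterion = nn.MSELoss()
x, y = torch.randn(8, 10), torch.randn(8, 2)

# Case 1: a single loss shared by both optimizers
opt_a.zero_grad()
opt_b.zero_grad()
loss = criterion(model(x), y)
loss.backward()   # one backward populates .grad for every parameter
opt_a.step()      # each step() only updates the parameters that optimizer was given
opt_b.step()

# Case 2: one loss per optimizer
opt_a.zero_grad()
opt_b.zero_grad()
loss_a = criterion(model(x), y)
loss_a.backward()
opt_a.step()

opt_a.zero_grad()                # clear loss_a's gradients before the second backward
opt_b.zero_grad()
loss_b = model(x).pow(2).mean()  # a second, made-up loss, recomputed after the first step
loss_b.backward()
opt_b.step()
```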

What would be an example of the ‘one loss per optimiser’ situation?

Thanks :smiley:

Things like GANs, where the generator and discriminator have different losses.
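Something along these lines, as a very rough sketch (the architectures, losses and hyper-parameters are just placeholders, not a real GAN setup):

```python
import torch
import torch.nn as nn

# Placeholder generator and discriminator
G = nn.Sequential(nn.Linear(16, 32), nn.ReLU(), nn.Linear(32, 8))   # generator
D = nn.Sequential(nn.Linear(8, 32), nn.ReLU(), nn.Linear(32, 1))    # discriminator
opt_g = torch.optim.Adam(G.parameters(), lr=2e-4)
opt_d = torch.optim.Adam(D.parameters(), lr=2e-4)
bce = nn.BCEWithLogitsLoss()

real = torch.randn(4, 8)     # pretend batch of real samples
noise = torch.randn(4, 16)

# Discriminator step: its own loss, its own optimizer
opt_d.zero_grad()
d_loss = bce(D(real), torch.ones(4, 1)) + bce(D(G(noise).detach()), torch.zeros(4, 1))
d_loss.backward()
opt_d.step()

# Generator step: a different loss, the other optimizer
opt_g.zero_grad()
g_loss = bce(D(G(noise)), torch.ones(4, 1))
g_loss.backward()   # gradients flow through D into G, but only opt_g.step() updates G
opt_g.step()
```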