How many optimizers do I need?

Hi, all.
I'm wondering how many optimizers are needed to train certain architectures.
My architecture is composed of three classifiers and two encoders:

         -> Classifier1
Encoder1 -> Encoder2 -> Classifier2
         -> Classifier3

Can one optimizer train this model well?
When I use one optimizer, I sum up all the losses (in this case, three classification losses). So I'm not sure whether the three modules are trained well simultaneously.

The number of losses and the number of optimizers do not depend on each other.
E.g. you could use a single optimizer for the complete model to apply the same optimization method to all parameters.
The loss(es) determine the gradient calculation. Calling .backward() on each loss separately, or summing the losses and calling .backward() once, will result in the same gradients.