Backprop with Multiple Classifier Heads (Linear Layers) with Single CNN (Embedding Network)

Hi,

I was curious whether anyone has experience training multiple classifier heads while updating a single shared embedding network (CNN). Is there a way to elegantly call .backward() so it flows through the shared CNN and only the single appropriate classifier head? I would prefer to do it inside a single nn.Module because I believe it will make updating the shared CNN easier, but I am open to other suggestions.

Thank you!

Alison

How would the classifier heads be selected? If the selection is ordinary Python control flow, then backprop will only happen through the head that was actually used.

I was thinking that, upon calling model.forward(), the appropriate classifier head would be selected from a list. If that were the case, would backprop only happen through that head?

I plan to cycle through all of the classifier heads each training epoch.
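A minimal sketch of that setup, assuming a PyTorch nn.Module that holds the heads in an nn.ModuleList and picks one by index in forward() (all names and layer sizes below are illustrative, not from this thread):

```python
import torch
import torch.nn as nn

class MultiHeadNet(nn.Module):
    """Shared embedding backbone with several classifier heads.

    MultiHeadNet, head_idx, and the layer sizes are made-up examples.
    """
    def __init__(self, embed_dim=16, num_heads=3, num_classes=5):
        super().__init__()
        # Stand-in for the shared CNN embedding network.
        self.backbone = nn.Sequential(
            nn.Conv2d(1, 8, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
            nn.Flatten(),
            nn.Linear(8, embed_dim),
        )
        # One linear classifier head per task.
        self.heads = nn.ModuleList(
            nn.Linear(embed_dim, num_classes) for _ in range(num_heads)
        )

    def forward(self, x, head_idx):
        z = self.backbone(x)            # shared embedding
        return self.heads[head_idx](z)  # only the selected head joins the graph

model = MultiHeadNet()
x = torch.randn(4, 1, 8, 8)
logits = model(x, head_idx=1)
loss = nn.functional.cross_entropy(logits, torch.tensor([0, 1, 2, 3]))
loss.backward()
# Heads 0 and 2 were never in the graph, so they receive no gradients.
print([h.weight.grad is not None for h in model.heads])  # [False, True, False]
```

Because indexing into the ModuleList is ordinary control flow, loss.backward() populates gradients only for the backbone and the selected head; cycling head_idx over epochs needs no special handling.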

Yes, backprop would only happen through that particular head. If you want to verify this explicitly, you can snapshot something like head.weight.sum() before and after an optimizer step to check which head had its weights updated in a particular iteration.
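That check could look like the following toy example, which snapshots each head's weight sum before and after one optimizer step (the sizes and the plain Linear backbone are placeholders, not the actual model):

```python
import torch
import torch.nn as nn

# Toy setup to verify which head actually gets updated.
torch.manual_seed(0)
backbone = nn.Linear(6, 4)                        # stand-in for the shared CNN
heads = nn.ModuleList([nn.Linear(4, 3), nn.Linear(4, 3)])
params = list(backbone.parameters()) + list(heads.parameters())
opt = torch.optim.SGD(params, lr=0.1)

before = [h.weight.sum().item() for h in heads]   # snapshot each head's weight sum

x = torch.randn(8, 6)
y = torch.randint(0, 3, (8,))
loss = nn.functional.cross_entropy(heads[0](backbone(x)), y)  # only head 0 is used
opt.zero_grad()
loss.backward()
opt.step()                                        # params with grad=None are skipped

after = [h.weight.sum().item() for h in heads]
# Head 1 was never in the graph, so its weight sum is bit-for-bit unchanged.
print([a == b for a, b in zip(after, before)])
```

The unused head's gradients stay None, and optimizers skip parameters whose .grad is None, so its weight sum compares exactly equal across the step.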


Got it, I’ll give it a try and see! Thanks!