Simultaneous Training and Validation with SWA

Hi, I am using Stochastic Weight Averaging from torchcontrib for the first time. I noticed that the model weights are only replaced with the averaged ones at the end of training, by calling the swap_swa_sgd method, and that the Batch Normalization statistics also need to be updated at the end of training.
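For reference, my setup roughly follows the usual torchcontrib pattern (a minimal sketch; the toy model, data, and SWA hyperparameters here are just placeholders):

```python
import torch
import torch.nn as nn
from torch.utils.data import DataLoader, TensorDataset
from torchcontrib.optim import SWA

# Toy model and data just to make the sketch self-contained.
model = nn.Sequential(nn.Linear(10, 16), nn.BatchNorm1d(16),
                      nn.ReLU(), nn.Linear(16, 2))
loss_fn = nn.CrossEntropyLoss()

train_loader = DataLoader(
    TensorDataset(torch.randn(64, 10), torch.randint(0, 2, (64,))),
    batch_size=16)

# Wrap a base optimizer in SWA (hyperparameter values are arbitrary).
base_opt = torch.optim.SGD(model.parameters(), lr=0.1)
opt = SWA(base_opt, swa_start=10, swa_freq=5, swa_lr=0.05)

for epoch in range(20):
    for x, y in train_loader:
        opt.zero_grad()
        loss_fn(model(x), y).backward()
        opt.step()

# Only at the very end: swap in the averaged weights
# and recompute the BatchNorm running statistics.
opt.swap_swa_sgd()
opt.bn_update(train_loader, model)
```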

So I believe I cannot run validation during training, since the averaged weights have not been swapped in yet. Is my understanding correct?
If yes, is there any workaround to train and validate simultaneously with SWA?

swap_swa_sgd just swaps the current parameters with the SWA ones on each call, as you can see in the source code.
If you would like to validate your model, you could call it once before the validation pass, and again afterwards to get your original parameters back.
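Something along these lines (a sketch; model and opt are assumed from your training loop, and val_loader is a hypothetical validation loader — note that before swa_start is reached there may be no average to swap in yet):

```python
import torch

def validate_with_swa(opt, model, val_loader):
    # Swap the SWA running average into the model parameters.
    opt.swap_swa_sgd()
    model.eval()
    correct = total = 0
    with torch.no_grad():
        for x, y in val_loader:
            correct += (model(x).argmax(dim=1) == y).sum().item()
            total += y.size(0)
    model.train()
    # Swap back so training continues from the current SGD weights.
    opt.swap_swa_sgd()
    return correct / total
```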

I’m not sure if you can revert the batchnorm update.
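One possible (untested) workaround would be to snapshot the BatchNorm running buffers yourself before calling bn_update and restore them afterwards; the evaluate helper below is hypothetical (your usual eval loop):

```python
import copy

def validate_swa_with_bn(opt, model, train_loader, val_loader):
    # Back up the BN running statistics before they get overwritten.
    bn_backup = copy.deepcopy(
        {k: v for k, v in model.state_dict().items()
         if 'running_mean' in k or 'running_var' in k
         or 'num_batches_tracked' in k})

    opt.swap_swa_sgd()                  # averaged weights in
    opt.bn_update(train_loader, model)  # BN stats for the averaged weights
    acc = evaluate(model, val_loader)   # hypothetical eval-loop helper

    opt.swap_swa_sgd()                  # original weights back
    # Restore the original BN statistics (partial load, hence strict=False).
    model.load_state_dict(bn_backup, strict=False)
    return acc
```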
