Hi, I am using Stochastic Weight Averaging from torchcontrib for the first time. I noticed that the model weights are only updated at the end of training, by calling the swap_swa_sgd method, and that the Batch Normalization statistics also need to be updated at the end of training.
So I believe I cannot run validation during training, since the model's weights hold the SGD values rather than the SWA averages until the swap. Is my understanding correct?
If so, is there any workaround to run training and validation simultaneously with SWA?
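For context, one workaround I have seen suggested is to keep the averaged weights in a separate copy of the model, so validation never needs to touch the weights being trained. This is what PyTorch's built-in torch.optim.swa_utils does (as opposed to torchcontrib's in-place swap). Below is a minimal sketch under that assumption; the model, data, and hyperparameters are made up for illustration:

```python
import torch
import torch.nn as nn
from torch.optim.swa_utils import AveragedModel, update_bn

# Toy model and synthetic data, purely illustrative
model = nn.Linear(4, 2)
swa_model = AveragedModel(model)  # separate copy holding the running average
opt = torch.optim.SGD(model.parameters(), lr=0.01)
loss_fn = nn.MSELoss()
loader = [(torch.randn(8, 4), torch.randn(8, 2)) for _ in range(5)]

for epoch in range(3):
    for x, y in loader:
        opt.zero_grad()
        loss_fn(model(x), y).backward()
        opt.step()
    # Fold the current SGD weights into the running average
    swa_model.update_parameters(model)
    # Validate the averaged weights at any time -- `model` is untouched
    with torch.no_grad():
        val_loss = loss_fn(swa_model(x), y)

# Recompute BatchNorm running statistics for the averaged model at the end
update_bn(loader, swa_model)
```

With torchcontrib specifically, since swap_swa_sgd swaps the parameters in place, I believe calling it once before validation and once again afterwards would restore the SGD weights, but I have not verified that this is safe mid-training.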