Which modules are affected by these modes besides BatchNorm and Dropout? I was wondering in which cases the two modes are interchangeable.
Calling `eval()` changes the internal `self.training` flag, so you could grep for it in the source folder (`grep -r self.training`).
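To see the flag itself in action: `train()` and `eval()` set `self.training` recursively on a module and all of its submodules, which is what the affected layers check at runtime. A minimal sketch:

```python
import torch.nn as nn

# A small model containing a mode-sensitive layer (Dropout)
model = nn.Sequential(nn.Linear(4, 4), nn.Dropout(p=0.5))

model.eval()
print(model.training)     # False
print(model[1].training)  # False -- eval() is applied recursively to submodules

model.train()
print(model.training)     # True
```

Any layer whose behavior differs between the two modes just branches on this boolean in its `forward`.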
Currently it seems these modules are affected by it in the PyTorch core:
- Quantization modules
- RNN (probably only if using cudnn, as different implementations will be called)
Besides that, any custom `nn.Module` might of course use this flag internally.
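As a sketch of how such a custom module could use the flag (the module name and behavior here are purely illustrative, not from any library):

```python
import torch
import torch.nn as nn

class NoiseInjection(nn.Module):
    """Hypothetical module: adds Gaussian noise only in training mode."""
    def __init__(self, std: float = 0.1):
        super().__init__()
        self.std = std

    def forward(self, x):
        if self.training:  # flag toggled by train()/eval()
            return x + torch.randn_like(x) * self.std
        return x           # identity during evaluation

m = NoiseInjection()
x = torch.zeros(2, 3)
m.eval()
print(torch.equal(m(x), x))  # True -- no noise in eval mode
```

So for such modules the two modes are interchangeable only if the author made them so; the only reliable check is reading the module's `forward` for uses of `self.training`.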