Do I need to use the same batch size when testing?

Hi, here is a simple question:

I trained a DNN containing BatchNorm1d with a batch size of 64.
Do I need to use the same batch size (64) in the test loader?


If you are testing your model, you should call model.eval() first. This switches batchnorm layers to use their running statistics (the mean and variance accumulated during training) instead of per-batch statistics, and it disables dropout. Because batchnorm no longer depends on the statistics of the current batch, you don't need the test loader to use the same batch size as the train loader.
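A minimal sketch illustrating this (the architecture and shapes here are made up for the example): after model.eval(), the same model accepts any batch size, including a single sample.

```python
import torch
import torch.nn as nn

# Toy model with a BatchNorm1d layer (hypothetical architecture)
model = nn.Sequential(
    nn.Linear(10, 32),
    nn.BatchNorm1d(32),
    nn.ReLU(),
    nn.Linear(32, 2),
)

# "Train" on batches of 64 so batchnorm accumulates running statistics
model.train()
for _ in range(5):
    model(torch.randn(64, 10))

# In eval mode, batchnorm uses the stored running mean/var,
# so any batch size works -- even a single sample
model.eval()
with torch.no_grad():
    print(model(torch.randn(1, 10)).shape)   # torch.Size([1, 2])
    print(model(torch.randn(7, 10)).shape)   # torch.Size([7, 2])
```

Note that in train mode a batch size of 1 would actually raise an error, since BatchNorm1d needs more than one value per channel to compute batch statistics; in eval mode it is fine.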
