Proper way of fixing batchnorm layers during training

@maunz
I had the same problem. It turned out I was setting eval mode incorrectly.
This thread helped me: Freeze BatchNorm layer lead to NaN

def set_bn_to_eval(m):
    # Switch any BatchNorm layer (1d/2d/3d) to eval mode so its
    # running mean/variance statistics stop updating during training.
    classname = m.__class__.__name__
    if classname.find('BatchNorm') != -1:
        m.eval()

net.apply(set_bn_to_eval)
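
One gotcha: calling net.train() at the start of each epoch puts every submodule, including the frozen BatchNorm layers, back into train mode, so you need to re-apply the hook right after it. A minimal sketch of that pattern, using a toy model just for illustration:

import torch.nn as nn

def set_bn_to_eval(m):
    if m.__class__.__name__.find('BatchNorm') != -1:
        m.eval()

net = nn.Sequential(nn.Conv2d(3, 8, 3), nn.BatchNorm2d(8), nn.ReLU())  # toy model

for epoch in range(3):
    net.train()                # resets ALL submodules to train mode,
    net.apply(set_bn_to_eval)  # so BN must be re-frozen every epoch
    # ... forward / backward / optimizer.step() here ...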
If you want to apply set_bn_to_eval to only a subnet or base network, just call apply on that submodule:
net.<submodule>.apply(set_bn_to_eval)
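
For example, reusing set_bn_to_eval from above with a torchvision ResNet, you could freeze BN statistics only in the early pretrained stages (layer1/layer2 are real attributes of torchvision's ResNet; which stages to freeze is your choice):

import torchvision

model = torchvision.models.resnet18(pretrained=True)
model.layer1.apply(set_bn_to_eval)  # freeze BN running stats in stage 1
model.layer2.apply(set_bn_to_eval)  # ...and stage 2; later stages keep updating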

Hope this helps.
