Accuracy drops from 100% to random when replacing BatchNorm

Hello!

I'm working on a fairly simple 3D binary classification task using this EfficientNet. I've been experimenting quite a lot with BatchNorm and found something that seems strange to me: when I replace BatchNorm3d with InstanceNorm3d or LayerNorm, all other things being equal, my accuracy drops from 100% to random (around 50%) and the loss doesn't improve at all (see the sketch below for how I do the swap).
I expected some performance decrease, but nothing as drastic as going from perfect to random.
What should I look for?
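
For reference, here is roughly how I do the replacement. This is a minimal sketch assuming the model exposes standard `nn.BatchNorm3d` modules (the model constructor name below is just a placeholder for my own code); the InstanceNorm3d case is shown, the LayerNorm one is analogous:

```python
import torch.nn as nn

def swap_batchnorm3d(module: nn.Module) -> None:
    """Recursively replace every nn.BatchNorm3d with an nn.InstanceNorm3d
    of the same channel count (affine=True keeps learnable scale/shift)."""
    for name, child in module.named_children():
        if isinstance(child, nn.BatchNorm3d):
            setattr(module, name, nn.InstanceNorm3d(child.num_features, affine=True))
        else:
            swap_batchnorm3d(child)

# model = MyEfficientNet3D(...)  # placeholder for the 3D EfficientNet I'm using
# swap_batchnorm3d(model)        # everything else (optimizer, LR, data) stays the same
```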

Thanks a lot!