Should I use batch norm in my model if I use a pretrained model that was trained with batch norm?

I'm working on a VQA (visual question answering) model.

When I use a pretrained vgg16_bn without fine-tuning and my own model has no batch norm, accuracy stays at chance level. However, as soon as I add batch norm to my model, it starts training properly.
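Here is a minimal sketch of the kind of setup I mean (the dimensions, names, and fusion head are just illustrative, not my exact code): the frozen vgg16_bn provides image features, and `use_bn` toggles the batch norm layer that seems to make the difference.

```python
import torch
import torch.nn as nn
from torchvision import models

# Frozen vgg16_bn used purely as a feature extractor (no fine-tuning).
vgg = models.vgg16_bn(pretrained=True)
vgg.eval()
for p in vgg.parameters():
    p.requires_grad = False

def image_features(images):
    # 4096-d features from the penultimate FC layer of the frozen backbone.
    with torch.no_grad():
        x = vgg.features(images)
        x = torch.flatten(vgg.avgpool(x), 1)
        return vgg.classifier[:-1](x)

class VQAHead(nn.Module):
    # Illustrative fusion head; use_bn toggles the batch norm in question.
    def __init__(self, img_dim=4096, q_dim=1024, hidden=1024,
                 n_answers=1000, use_bn=True):
        super().__init__()
        layers = [nn.Linear(img_dim + q_dim, hidden)]
        if use_bn:
            layers.append(nn.BatchNorm1d(hidden))
        layers += [nn.ReLU(inplace=True), nn.Linear(hidden, n_answers)]
        self.classifier = nn.Sequential(*layers)

    def forward(self, img_feat, q_feat):
        # Concatenate image and question features, then classify answers.
        return self.classifier(torch.cat([img_feat, q_feat], dim=1))
```

With `use_bn=False` the model stays at random accuracy; with `use_bn=True` it trains.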

So my conclusion is that, at least in some cases, when I use a pretrained model that was trained with batch norm, I should also use batch norm in my own model.

Is that correct?