How to backpropagate through one self-defined block while freezing the other layers?

Hi, I want to add a new module into a pretrained backbone, e.g. ResNet-101 (the module is inserted at the block level). ResNet-101 already has pretrained weights, while the new module still needs to be trained. My question is: can I set requires_grad=False on all layers of the backbone except the new module? I am not sure whether the gradient will still be propagated normally in this case.
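Roughly what I have in mind (just a sketch, assuming a recent torchvision; NewBlock is a placeholder for my module, and inserting it after layer3 is only an example):

```python
import torch.nn as nn
from torchvision import models

# Placeholder for the new trainable module (block-level insertion).
class NewBlock(nn.Module):
    def __init__(self, channels):
        super().__init__()
        self.conv = nn.Conv2d(channels, channels, kernel_size=3, padding=1)
        self.relu = nn.ReLU(inplace=True)

    def forward(self, x):
        return x + self.relu(self.conv(x))

backbone = models.resnet101(weights=models.ResNet101_Weights.DEFAULT)

# Freeze all pretrained parameters first ...
for p in backbone.parameters():
    p.requires_grad = False

# ... then insert the new block, e.g. after layer3 (1024 output channels).
# NewBlock's parameters keep requires_grad=True, so only it will be trained.
backbone.layer3 = nn.Sequential(backbone.layer3, NewBlock(1024))
```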
A simplified version: one net consisting of layers A-B-C. Suppose I set requires_grad=False on layers A and C. Will B still receive gradients during the backward pass?
Thanks to anyone who can help!

Yes, that should work. Setting requires_grad=False only stops gradients from being accumulated for those parameters; Autograd will still propagate the gradient through the frozen layers to any preceding modules/tensors that need it.
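Here is a quick way to check it yourself (a minimal sketch of the A-B-C case, with small linear layers standing in for the real blocks):

```python
import torch
import torch.nn as nn

# A and C play the role of the frozen, "pretrained" layers;
# B is the new module that should be trained.
A, B, C = nn.Linear(10, 10), nn.Linear(10, 10), nn.Linear(10, 2)
for p in list(A.parameters()) + list(C.parameters()):
    p.requires_grad = False

net = nn.Sequential(A, B, C)
loss = net(torch.randn(4, 10)).sum()
loss.backward()

print(B.weight.grad is not None)     # True  -> B receives gradients
print(A.weight.grad, C.weight.grad)  # None None -> frozen layers do not

# Pass only the trainable parameters to the optimizer,
# so the frozen ones are skipped entirely.
optimizer = torch.optim.SGD(
    [p for p in net.parameters() if p.requires_grad], lr=1e-3
)
```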

Thank you very much!