How do I freeze some layers of a network in PyTorch and train only the rest?

Setting .requires_grad = False should work for convolution and fully connected (FC) layers. But what about networks that contain InstanceNorm layers? Is setting .requires_grad = False enough for normalization layers too? Below is a minimal sketch of what I mean by freezing.
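The model and layer names here are just placeholders for illustration, not my actual network:

```python
import torch
import torch.nn as nn

# Hypothetical model, just to illustrate the question
model = nn.Sequential(
    nn.Conv2d(3, 16, kernel_size=3, padding=1),
    nn.InstanceNorm2d(16, affine=True),
    nn.ReLU(),
    nn.Flatten(),
    nn.Linear(16 * 32 * 32, 10),
)

# Freeze everything except the final Linear layer ("4." in this Sequential)
for name, param in model.named_parameters():
    if not name.startswith("4."):
        param.requires_grad = False

# Pass only the trainable parameters to the optimizer
optimizer = torch.optim.SGD(
    (p for p in model.parameters() if p.requires_grad), lr=0.01
)
```

Is this enough to fully freeze the InstanceNorm layer, or does it need extra handling (e.g. switching it to eval mode)?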
