How do I freeze some layers of a network in PyTorch and train only the rest?

Does this work even if the network has already been trained? Say I have loaded a net pre-trained on X and I want to freeze layer Y (say the 2nd layer, to make the example concrete). How exactly do I do that?
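
Is something along these lines the right approach? This is just a minimal sketch with a toy `nn.Sequential` standing in for the pre-trained net (the layer indices and sizes are made up for illustration):

```python
import torch.nn as nn
import torch.optim as optim

# Toy stand-in for the pre-trained net; in my case the model is loaded
# from a checkpoint instead of being built fresh like this.
model = nn.Sequential(
    nn.Linear(10, 20),  # "1st layer"
    nn.ReLU(),
    nn.Linear(20, 20),  # "2nd layer" -- the one I want to freeze
    nn.ReLU(),
    nn.Linear(20, 2),   # "3rd layer"
)

# Freeze the 2nd layer by turning off gradient tracking for its parameters.
for param in model[2].parameters():
    param.requires_grad = False

# Pass only the still-trainable parameters to the optimizer so the frozen
# ones are never updated.
optimizer = optim.SGD(
    (p for p in model.parameters() if p.requires_grad),
    lr=0.01,
)
```

Is setting `requires_grad = False` and filtering the optimizer's parameters enough, or is there more to it when the weights come from a pre-trained checkpoint?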