Transfer learning: train only the last two layers (blocks)

Yes, you can freeze specific layers and train others by manipulating the .requires_grad attribute of their parameters (setting it to False freezes that parameter, so the optimizer won't update it). The Finetuning tutorial gives an example.
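
For instance, here is a minimal sketch assuming a torchvision ResNet-18 backbone, where only the last residual block and the classifier head are trained; the attribute names layer4 and fc come from torchvision's ResNet implementation, so adapt them to your own model:

```python
import torch
import torchvision

# Load a pretrained backbone (torchvision >= 0.13 weights API).
model = torchvision.models.resnet18(weights="IMAGENET1K_V1")

# Freeze every parameter first.
for param in model.parameters():
    param.requires_grad = False

# Unfreeze the last residual block and the classifier head.
for param in model.layer4.parameters():
    param.requires_grad = True
for param in model.fc.parameters():
    param.requires_grad = True

# Pass only the trainable parameters to the optimizer.
optimizer = torch.optim.SGD(
    (p for p in model.parameters() if p.requires_grad),
    lr=1e-3,
    momentum=0.9,
)
```

Filtering the parameters passed to the optimizer is optional (frozen parameters get no gradients anyway), but it keeps the optimizer state small and makes the intent explicit.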