How to drop the top layer/head of a fine-tuned transformers model

How can I drop the top layer or head from a pretrained/fine-tuned transformers model?
For example, I have fine-tuned a deberta-v3-large for NER, and I want to use the same model for multi-class classification. Since the data is the same, I decided to reuse the fine-tuned model for the classification task, replace the NER head with a new head, and train for one or two epochs. Is there a way to freeze the previous layers and train only the head on this task, as we do with CNN models (like ResNet50)?
I am new to PyTorch. I know how to freeze the initial layers of a CNN in TensorFlow; can we do the same in PyTorch?
Thanks!

You can iterate over the parameters you want to freeze and set their .requires_grad attribute to False.
Depending on the model architecture, something like this could work:

# The attribute name of the backbone depends on the architecture;
# for DeBERTa-v3 models it is exposed as model.deberta rather than model.feature_extractor.
for param in model.feature_extractor.parameters():
    param.requires_grad = False
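
For the head swap itself, one common approach is to load the fine-tuned NER checkpoint into a sequence-classification model class; the backbone weights are reused and a fresh classification head is initialized. A minimal sketch, assuming a local checkpoint path and a made-up number of labels (both placeholders you would replace):

```python
from transformers import AutoModelForSequenceClassification

# Placeholder path to your DeBERTa-v3-large checkpoint fine-tuned for NER.
checkpoint = "path/to/deberta-v3-large-ner"

# Loading the token-classification checkpoint with a sequence-classification
# class reuses the backbone weights and initializes a new classifier head.
model = AutoModelForSequenceClassification.from_pretrained(
    checkpoint,
    num_labels=5,                  # set to your number of classes
    ignore_mismatched_sizes=True,  # in case the old head's shape conflicts
)

# Freeze the DeBERTa backbone so only the new head is trained.
for param in model.deberta.parameters():
    param.requires_grad = False
```

You can then confirm which parameters will actually be updated with something like `[n for n, p in model.named_parameters() if p.requires_grad]` before starting training.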