Yep, the practice of retraining a network that has already been (fully) trained is called fine-tuning. The proper way to describe to others what you did is to say you took network X pretrained on dataset Y and fine-tuned it for W.
You have to set requires_grad = True, but fine-tuning usually calls for a small learning rate, since the pretrained weights are assumed to already be close to optimal.
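A minimal sketch of that setup in PyTorch (the tiny `nn.Sequential` here is a stand-in for whatever pretrained model you loaded, e.g. from torchvision):

```python
import torch
import torch.nn as nn

# Stand-in for a pretrained model; in practice you would load one, e.g.
# torchvision.models.resnet18(weights=...) and then fine-tune it.
model = nn.Sequential(nn.Linear(10, 5), nn.ReLU(), nn.Linear(5, 2))

# Make every parameter trainable. requires_grad=True is the default for
# freshly created parameters, but set it explicitly in case some layers
# were frozen earlier (a common pattern with pretrained backbones).
for p in model.parameters():
    p.requires_grad = True

# Fine-tuning: use a small learning rate, since the pretrained weights
# are assumed to already be close to a good optimum and large steps
# would destroy them.
optimizer = torch.optim.SGD(model.parameters(), lr=1e-4, momentum=0.9)
```

From here the training loop is the usual forward / loss / `backward()` / `optimizer.step()` on your new dataset W.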