How to update the parameters of only one group in an optimizer?

I have an optimizer to which I added three parameter groups. I computed a loss, and now I want to update only the first group of parameters. How can I do that?
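To make this concrete, here is a minimal sketch of my setup (the parameter names and learning rates are placeholders, not my actual model):

```python
import torch

# Three hypothetical parameter tensors, one per optimizer group
p1 = torch.nn.Parameter(torch.randn(3))
p2 = torch.nn.Parameter(torch.randn(3))
p3 = torch.nn.Parameter(torch.randn(3))

optimizer = torch.optim.SGD([
    {"params": [p1], "lr": 0.1},  # group 0 -- the only one I want to update
    {"params": [p2], "lr": 0.1},  # group 1
    {"params": [p3], "lr": 0.1},  # group 2
])

loss = (p1.sum() + p2.sum() + p3.sum()) ** 2
optimizer.zero_grad()
loss.backward()   # fills .grad for the parameters in ALL three groups
optimizer.step()  # updates ALL three groups -- not what I want
```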

PS: In TensorFlow, I explicitly ask for the gradients of the loss with respect to those parameters only. But here in PyTorch, I just call loss.backward() and gradients are computed for everything. What is the equivalent operation in PyTorch?