There are a few ways to do this. The simplest is to ignore the gradients computed for model2 and give only model1's parameters to your optimizer, so that only model1 is updated when you call .step().
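A minimal sketch of this, assuming model1 and model2 are nn.Module instances and using SGD with an arbitrary learning rate as a stand-in for whatever optimizer you actually use:

```python
import torch.optim as optim

# Register only model1's parameters with the optimizer.
# backward() may still compute gradients for model2, but
# optimizer.step() will never apply them.
optimizer = optim.SGD(model1.parameters(), lr=0.01)  # lr chosen arbitrarily
```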
You can also go through all of model2's parameters and set requires_grad=False on each of them manually. That way no gradients are computed for them at all, which also saves memory and compute during the backward pass.
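For example:

```python
# Freeze every parameter of model2 so autograd skips them entirely.
for param in model2.parameters():
    param.requires_grad = False
```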