I understand that tensors with
requires_grad=False do not have gradients computed for them. I am wondering whether I can change the
requires_grad flag during training so that I can train or freeze parts of the network on the fly. I want to train certain parameters when some condition is met and freeze them when it is not. Example code is below. When I change this flag, will the optimizer pick up the change automatically, or do I have to do something to let the optimizer know about it?
```python
....
optimizer = optim.Adam(model.parameters(), lr=lr)
....
for epoch in range(total_epochs):
    ....
    # training goes on
    ....
    if certain_condition:
        # 'certain_parameter' is a model parameter in model.parameters()
        certain_parameter.requires_grad = False
    else:
        certain_parameter.requires_grad = True
    ....
    # training goes on
    ....
```
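To make the question concrete, here is a small runnable toy version of what I mean. The model, the parameter I pick, and the "condition" (freezing on even epochs) are all made up just for illustration:

```python
import torch
import torch.nn as nn
import torch.optim as optim

model = nn.Linear(10, 1)                      # toy model with 'weight' and 'bias' parameters
optimizer = optim.Adam(model.parameters(), lr=1e-3)
certain_parameter = model.bias                # the parameter I want to freeze/unfreeze

for epoch in range(5):
    # toggle the flag based on some condition (here: train the bias only on odd epochs)
    certain_parameter.requires_grad = (epoch % 2 == 1)

    x = torch.randn(16, 10)
    y = torch.randn(16, 1)
    loss = ((model(x) - y) ** 2).mean()

    optimizer.zero_grad()
    loss.backward()                           # no gradient is produced for frozen parameters
    optimizer.step()
```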