Hello,
If I set up my optimizer and scheduler as follows, will it keep the weights of model_gpt2.transformer.h.parameters()
fixed during training, while still updating the weights of model_gpt2.multiple_choice_head.parameters()?
I don't want model_gpt2.transformer.h.parameters() to be altered during training;
I only want to adjust model_gpt2.multiple_choice_head.parameters()
in my training loop:
optimizer_1 = torch.optim.SGD(model_gpt2.transformer.h.parameters(), lr=0.0)
optimizer_2 = torch.optim.Adam(model_gpt2.multiple_choice_head.parameters(), lr=0.00013)
scheduler_1 = torch.optim.lr_scheduler.StepLR(optimizer_1, step_size=1, gamma=1.5)
scheduler_2 = torch.optim.lr_scheduler.StepLR(optimizer_2, step_size=1, gamma=1.5)
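For context, here is a minimal pure-Python sketch (hypothetical helper names, not PyTorch itself, and ignoring momentum/weight decay) of the arithmetic I am relying on: StepLR rescales the learning rate multiplicatively (lr *= gamma), so a rate that starts at 0.0 stays at 0.0, and the vanilla SGD update w -= lr * g then never moves the "frozen" weights:

```python
def sgd_step(weights, grads, lr):
    # Vanilla SGD update: w <- w - lr * g. With lr == 0.0 the weights are unchanged.
    return [w - lr * g for w, g in zip(weights, grads)]

def steplr(lr, gamma=1.5):
    # StepLR multiplies the current learning rate by gamma every step_size epochs.
    return lr * gamma

frozen_lr, head_lr = 0.0, 0.00013
frozen_w = [0.5, -1.2]   # stands in for transformer.h parameters
head_w = [0.3, 0.7]      # stands in for multiple_choice_head parameters
grads = [1.0, 1.0]

for _ in range(3):
    frozen_w = sgd_step(frozen_w, grads, frozen_lr)
    head_w = sgd_step(head_w, grads, head_lr)
    frozen_lr = steplr(frozen_lr)  # 0.0 * 1.5 stays 0.0
    head_lr = steplr(head_lr)      # grows each step: 0.000195, 0.0002925, ...

print(frozen_w)  # [0.5, -1.2] -- unchanged
```

That said, I understand the more common way to freeze a submodule is to set requires_grad = False on its parameters (or simply not pass them to any optimizer), which also saves the cost of computing and storing their gradients.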
Thank you,