Organizing optimizer trainable params per epoch

Hi, I have a network in which I want to train one set of parameters at odd-numbered epochs and another set at even-numbered epochs. What is the best way to organize this in PyTorch? Thanks!

Hi @Hovnatan_Karapetyan,

Given the two parameter sets P1 and P2, I think the best way is to create two optimizers with the same hyperparameters:

import torch

optim_even = torch.optim.SGD(P1, lr=0.01)
optim_odd  = torch.optim.SGD(P2, lr=0.01)

for e in range(epochs):
    # even epochs update P1, odd epochs update P2
    optim = optim_even if e % 2 == 0 else optim_odd
    for x in dataloader:
        # use optim normally here (zero_grad / backward / step)

There might be some further optimizations to make depending on the optimizer you use (SGD should be fine with this approach) and on your model topology, but this should be sufficient in simple use cases.
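
If you would rather keep a single optimizer, another option is to register both sets once and freeze the inactive set each epoch via requires_grad; parameters whose .grad stays None are simply skipped by step(). Below is a minimal sketch of that idea, assuming model.encoder / model.head as hypothetical stand-ins for P1 / P2 and a placeholder loss:

import torch

P1 = list(model.encoder.parameters())  # hypothetical: trained on even epochs
P2 = list(model.head.parameters())     # hypothetical: trained on odd epochs

optim = torch.optim.SGD(P1 + P2, lr=0.01)

for e in range(epochs):
    train_p1 = (e % 2 == 0)
    # Freeze the inactive set; its grads stay None, so optim.step() skips it
    for p in P1:
        p.requires_grad_(train_p1)
    for p in P2:
        p.requires_grad_(not train_p1)

    for x in dataloader:
        optim.zero_grad(set_to_none=True)
        loss = model(x).sum()  # placeholder loss for illustration
        loss.backward()
        optim.step()

With a stateful optimizer such as Adam, the frozen set's running statistics simply stop advancing while it is inactive.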