What is the best way to specify a learning rate per batch iteration? Let's say I have an (independently constructed) list [a1, a2, …, a_k]. What I need is for the first batch to have lr = a1, for the second lr = a2, and so on. Epochs don't matter here (i.e., it is irrelevant whether batch N falls in the same epoch as batch 0).
You could use e.g. `LambdaLR` and call `scheduler.step()` in each iteration as shown in this small code snippet:
```python
import torch
import torch.nn as nn

class MyLRS:
    def __init__(self, lrs):
        self.lrs = lrs

    def get_lr(self, epoch):
        # LambdaLR multiplies the optimizer's initial lr by this factor;
        # with an initial lr of 1.0 the returned value is used as the lr directly
        return self.lrs[epoch]

lrs = MyLRS(lrs=torch.linspace(1.0, 0.01, 100))
model = nn.Linear(1, 1)
optimizer = torch.optim.Adam(model.parameters(), lr=1.)
scheduler = torch.optim.lr_scheduler.LambdaLR(optimizer, lrs.get_lr, last_epoch=-1, verbose=True)

for epoch in range(2):
    for batch_idx in range(49):  # 2 * 49 steps stay within the 100 stored lrs
        optimizer.step()
        scheduler.step()
```
Note that you don't necessarily need to write a custom class for it and could also use a lambda function directly, if your code allows it.
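As a minimal sketch of the lambda variant (the 100-step linspace schedule is just an example), the list can be indexed directly by the scheduler's step counter:

```python
import torch
import torch.nn as nn

# example per-iteration learning rates, one entry per scheduler step
lrs = torch.linspace(1.0, 0.01, 100)

model = nn.Linear(1, 1)
# initial lr of 1.0, so the lambda's return value becomes the effective lr
optimizer = torch.optim.Adam(model.parameters(), lr=1.)
scheduler = torch.optim.lr_scheduler.LambdaLR(optimizer, lambda i: lrs[i])

for batch_idx in range(99):
    optimizer.step()
    scheduler.step()  # advance to the next entry in lrs

# after 99 steps the lr is lrs[99] == 0.01
print(float(optimizer.param_groups[0]["lr"]))
```

One caveat of the plain-lambda approach: a scheduler built from a lambda cannot be pickled with `torch.save(scheduler.state_dict(), ...)`-style workflows that serialize the lambda itself, which is a reason to prefer the small class if you need checkpointing.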
Alternatively, you could change the learning rate manually in the optimizer, or use another scheduler if your learning rates follow a specific pattern such as a multiplicative decay.
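The manual route just assigns to each `param_group` before stepping; a minimal sketch (the three-entry lr list is hypothetical):

```python
import torch
import torch.nn as nn

lrs = [0.1, 0.05, 0.01]  # hypothetical per-batch learning rates

model = nn.Linear(1, 1)
optimizer = torch.optim.Adam(model.parameters(), lr=lrs[0])

for batch_idx, lr in enumerate(lrs):
    # set the lr for this batch directly, no scheduler involved
    for param_group in optimizer.param_groups:
        param_group["lr"] = lr
    optimizer.step()

print(optimizer.param_groups[0]["lr"])
```

This avoids any scheduler bookkeeping entirely, at the cost of having to manage the iteration index yourself.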