# Decreasing the learning rate

Hi. I created a network and I have to train it. I used this code to set the optimizer:

```python
optimizer = optim.Adam(model.parameters(), lr=1e-2)
```

and this one for the training:

```python
epochs = 20
for epoch in range(epochs):
    loss = 0
    if __name__ == '__main__':
        for (batch_features, _) in train_loader:
            batch_features = batch_features.view(-1, 10694).to(device)
            optimizer.zero_grad()  # clear gradients from the previous batch
            outputs = model(batch_features)
            train_loss = criterion(outputs, batch_features)
            train_loss.backward()
            optimizer.step()
            loss += train_loss.item()
    loss = loss / len(train_loader)
    print("epoch : {}/{}, loss = {:.6f}".format(epoch + 1, epochs, loss))
```

Is there a way to decrease the learning rate (for example lr = lr/1.1) every 3 epochs?

Yes, you can use a learning rate scheduler: `MultiplicativeLR` is the one you are looking for.
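For example, here is a minimal sketch (with a small placeholder `nn.Linear` model standing in for your network) that divides the learning rate by 1.1 every 3 epochs:

```python
import torch
from torch import nn, optim
from torch.optim.lr_scheduler import MultiplicativeLR

model = nn.Linear(4, 4)  # placeholder model for illustration
optimizer = optim.Adam(model.parameters(), lr=1e-2)

# Multiply the lr by 1/1.1 whenever the epoch count reaches a multiple of 3;
# on every other epoch the factor is 1.0 (no change).
scheduler = MultiplicativeLR(
    optimizer,
    lr_lambda=lambda epoch: 1 / 1.1 if epoch > 0 and epoch % 3 == 0 else 1.0,
)

for epoch in range(6):
    # ... training batches would go here ...
    optimizer.step()
    scheduler.step()  # called once per epoch, unconditionally
```

`scheduler.step()` is called every epoch; the lambda itself decides (via the modulo check) when the decay actually happens.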

Ok thanks, I just have one doubt about it. I used these instructions before the for loop:

```python
# set the scheduler
scheduler = StepLRScheduler(optimizer, decay_t=2, decay_rate=0.9)
lr_per_epoch = get_lr_per_epoch(scheduler, epochs)
```

And the following inside the for loop:

```python
# scheduler step
scheduler.step(epoch)
```

Is that ok, or do I have to put all of the instructions inside the for loop?

You only need to initialize the scheduler once, so you are correct to leave those first calls out of the loop. Within the outer loop, you can call `scheduler.step()` every three epochs using a conditional like `if epoch % 3 == 0`.

Do you mean something like this?

```python
epochs = 40
scheduler = StepLRScheduler(optimizer, decay_t=3, decay_rate=0.9)
lr_per_epoch = get_lr_per_epoch(scheduler, epochs)
for epoch in range(epochs):
    loss = 0
    if __name__ == '__main__':
        for (batch_features, _) in train_loader:
            batch_features = batch_features.view(-1, 10694).to(device)
            optimizer.zero_grad()
            outputs = model(batch_features)
            train_loss = criterion(outputs, batch_features)
            train_loss.backward()
            optimizer.step()
            loss += train_loss.item()
    loss = loss / len(train_loader)
    if epoch % 3 == 0:
        scheduler.step(epoch)
    print("epoch : {}/{}, loss = {:.6f}".format(epoch + 1, epochs, loss))
```

Actually, you don’t need the `if epoch % 3 == 0` check with a step scheduler, since it handles that for you: you already specified the three-epoch decay interval when initializing `scheduler`.

Also, I’m not sure which documentation you are reading, but I believe the class you’re looking for is StepLR from `torch.optim.lr_scheduler`, rather than `StepLRScheduler`. Its arguments are called `step_size` and `gamma` rather than `decay_t` and `decay_rate`, and you need not pass `epoch` into `scheduler.step()`.
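With `StepLR`, the loop then simplifies to something like this (again with a small placeholder `nn.Linear` model; the decay by `gamma` every `step_size` epochs happens automatically on each unconditional `scheduler.step()`):

```python
import torch
from torch import nn, optim
from torch.optim.lr_scheduler import StepLR

model = nn.Linear(4, 4)  # placeholder model for illustration
optimizer = optim.Adam(model.parameters(), lr=1e-2)

# Multiply the learning rate by 0.9 every 3 epochs:
scheduler = StepLR(optimizer, step_size=3, gamma=0.9)

for epoch in range(9):
    # ... training batches would go here ...
    optimizer.step()
    scheduler.step()  # once per epoch; no epoch argument, no modulo check
```

After these 9 epochs the learning rate has been decayed three times (at epochs 3, 6, and 9), i.e. it equals `1e-2 * 0.9**3`.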

Thank you for the help.