Does anybody know what the [-1] index does in combination with optimizer.param_groups?

import torch
import torch.nn as nn

def fit(epochs, train_set, val_dl, model, lr):
    optimizer = torch.optim.Adam(model.parameters(), lr)  # defining the optimizer
    scheduler = torch.optim.lr_scheduler.OneCycleLR(
        optimizer, lr, epochs=epochs, steps_per_epoch=len(train_set))  # learning rate scheduler

    def get_lr():
        for param_group in optimizer.param_groups:  # getting the learning rate of each param group
            return param_group["lr"]

    history = []
    for epoch in range(epochs):
        model.train()
        train_loss = []
        lrs = []
        for batch in train_set:
            loss = training_step(batch, model)
            train_loss.append(loss)
            loss.backward()
            nn.utils.clip_grad_norm_(model.parameters(), 0.1)  # gradient clipping
            optimizer.step()
            optimizer.zero_grad()
            lrs.append(get_lr())  # record the lr used for this batch
            scheduler.step()      # the scheduler updates the lr after every batch

        # validation
        results = validation_combine_loss(val_dl, model)
        results["lrs"] = lrs
        results["train loss"] = torch.stack(train_loss).mean().item()
        epoch_end(results, epoch)
        history.append(results)

    return history

def epoch_end(result, epoch):
    print("epoch: [{}], last_lr {:.5f}####,  Epoch_loss: {:.4f}, Epoch_accuracy: {:.4f}, train_loss: {:.4f}"
          .format(epoch, result["lrs"][-1]####, result["Loss"], result["Accuracy"], result["train loss"]))

The areas that need help are denoted with “####”, thanks!

I’m not sure if you are wondering about the usage of:

print("epoch: [{}], last_lr {:.5f}####,  Epoch_loss: {:.4f}, Epoch_accuracy: {:.4f}, train_loss: {:.4f}"
      .format(epoch, result["lrs"][-1]####, result["Loss"], result["Accuracy"], result["train loss"]))

but if so:
[-1] indexes the “last” element of the list.
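
For example, with a hypothetical list of per-batch learning rates:

lrs = [0.001, 0.002, 0.004]  # hypothetical values, one appended per batch
print(lrs[-1])  # 0.004 -- the last element
print(lrs[-2])  # 0.002 -- the second-to-last element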

So in this case, the [-1] index is going to print the learning rate from the very last index? If so, why is it necessary to use the [-1] index when using

for param_group in optimizer.param_groups:  # getting the learning rate of each param group
    return param_group["lr"]

def epoch_end(result, epoch):
    print("epoch: [{}], last_lr {:.5f},  Epoch_loss: {:.4f}, Epoch_accuracy: {:.4f}, train_loss: {:.4f}"
          .format(epoch, result["lrs"][-1], result["Loss"], result["Accuracy"], result["train loss"]))

I saw this in a tutorial and I have been trying to figure out what the purpose/reason for using the [-1] index in this scenario could be.

I’m not familiar with the tutorial, but I would guess the authors wanted to print only the last learning rate of the epoch, not all of the learning rates created by the scheduler during the iterations. get_lr() returns the current learning rate each time it is called, but it is called once per batch, so lrs ends up holding one value per iteration, and lrs[-1] picks the most recent one.
The train_loss is handled the same way: the per-batch losses are stacked and the mean is taken over all iterations (one epoch), so a single value is printed.
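
As a minimal sketch (with a hypothetical tiny model and five dummy batches, not the tutorial's actual model or data), this is how lrs grows during one epoch under OneCycleLR:

import torch

model = torch.nn.Linear(2, 1)  # hypothetical tiny model
optimizer = torch.optim.Adam(model.parameters(), lr=0.01)
scheduler = torch.optim.lr_scheduler.OneCycleLR(
    optimizer, max_lr=0.01, epochs=1, steps_per_epoch=5)

lrs = []
train_loss = []
for step in range(5):  # one epoch with 5 batches
    loss = model(torch.randn(4, 2)).mean()  # dummy forward pass and "loss"
    train_loss.append(loss)
    loss.backward()
    optimizer.step()
    optimizer.zero_grad()
    lrs.append(optimizer.param_groups[0]["lr"])  # lr of the (only) param group for this batch
    scheduler.step()  # OneCycleLR changes the lr after every batch

print(lrs)       # 5 entries, one per batch
print(lrs[-1])   # the single value epoch_end prints as last_lr
print(torch.stack(train_loss).mean().item())  # one mean train loss per epoch

So lrs contains the whole learning-rate schedule of the epoch, and [-1] just selects the rate that was active for the final batch.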