Optimizer scheduler step

Hi, I am training a neural network in PyTorch using a training/validation/test split.
For the optimizer I am doing this:

from torch import optim
from torch.optim import lr_scheduler

optimizer = optim.Adam(model.parameters(), lr=args.lr)

scheduler = lr_scheduler.ReduceLROnPlateau(optimizer, verbose=True, patience=15, min_lr=1e-6)

but I was wondering whether the scheduler step should be taken with respect to the test loss or the validation loss. I mean, after every epoch should I do:

scheduler.step(test_loss)

or

scheduler.step(val_loss)

Thank you in advance.

I would use the validation loss and would not use the test loss at all (neither for the scheduler nor for early stopping etc., as I would consider that data leakage).
Once your training is finished using the training and validation datasets, you would evaluate on the test dataset once and deploy the model.
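
For reference, here is a minimal sketch of what that looks like inside the epoch loop. The names train_loader, val_loader, criterion, and args.epochs are assumptions standing in for your own DataLoaders, loss function, and epoch count; only the scheduler.step(val_loss) call at the end is the point being made.

import torch

for epoch in range(args.epochs):
    # Training pass over the training set.
    model.train()
    for inputs, targets in train_loader:
        optimizer.zero_grad()
        loss = criterion(model(inputs), targets)
        loss.backward()
        optimizer.step()

    # Compute the average validation loss without tracking gradients.
    model.eval()
    val_loss = 0.0
    with torch.no_grad():
        for inputs, targets in val_loader:
            val_loss += criterion(model(inputs), targets).item()
    val_loss /= len(val_loader)

    # Step the scheduler on the validation loss only;
    # the test set stays untouched until training is completely finished.
    scheduler.step(val_loss)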
