Reduce learning rate based on training or validation loss?

I have a learning rate scheduler that reduces the learning rate when the validation error plateaus. However, I am wondering whether this is the correct approach. Should the training error be used instead?
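For reference, the kind of scheduler I mean can be sketched like this (a minimal pure-Python illustration, loosely modeled on schedulers such as PyTorch's `ReduceLROnPlateau`; the function name and parameters here are illustrative, not my actual training code):

```python
# Illustrative sketch of plateau-based LR reduction (not my real code):
# track the monitored metric and cut the learning rate when it fails to
# improve for more than `patience` consecutive checks.

def reduce_lr_on_plateau(losses, lr=0.1, factor=0.5, patience=2):
    """Return the learning rate after stepping once per loss value."""
    best = float("inf")
    bad_epochs = 0
    for loss in losses:
        if loss < best:
            best = loss       # metric improved; reset the counter
            bad_epochs = 0
        else:
            bad_epochs += 1   # no improvement this check
            if bad_epochs > patience:
                lr *= factor  # plateau detected: reduce the LR
                bad_epochs = 0
    return lr

# Monitoring the validation loss: it plateaus after the second epoch,
# so the learning rate gets halved.
val_losses = [1.0, 0.8, 0.8, 0.8, 0.8]
print(reduce_lr_on_plateau(val_losses))  # 0.05
```

My question is only about which series (`val_losses` above vs. the training losses) should be passed to such a scheduler.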