Scheduler for triggering user-defined functions (not necessarily the optimizer)

Hi, is it possible to use a PyTorch scheduler to trigger functions other than the optimizer?
For example, here is a typical use of schedulers:

optimizer = torch.optim.SGD(model.parameters(), lr=0.1, momentum=0.9)
scheduler = ReduceLROnPlateau(optimizer, 'min')
for epoch in range(10):
    train(...)
    val_loss = validate(...)
    # Note that step should be called after validate()
    scheduler.step(val_loss)

but I would like the best practice for something like this:

optimizer = torch.optim.SGD(model.parameters(), lr=0.1, momentum=0.9)
scheduler = ReduceLROnPlateau(my_func, 'min')  # <-- my_func instead of the optimizer
for epoch in range(10):
    train(...)
    val_loss = validate(...)
    # Note that step should be called after validate()
    scheduler.step(val_loss)

The scheduler expects the optimizer as an input so that it can manipulate its learning rate.
What's your use case, and how would you like to change the learning rate of the optimizer without passing it?

There is Ignite, if you need flexible callbacks.

I don’t want to change the learning rate; I just want to use the scheduler’s signal to invoke another function. For example, whenever ReduceLROnPlateau decides to change the learning rate (because val_loss is not going down anymore), I want to use this as an indicator to invoke a function or change a part of the model, not necessarily the learning rate.

def my_func():
    print('hi')

optimizer = torch.optim.SGD(model.parameters(), lr=0.1, momentum=0.9)
scheduler = ReduceLROnPlateau(optimizer, 'min')
for epoch in range(10):
    train(...)
    val_loss = validate(...)
    # scheduler.step(val_loss)
    if scheduler decides to change the learning rate:  # pseudocode
        my_func()

In that case you might want to use @googlebot’s suggestion of callbacks from a higher-level wrapper, or alternatively derive a custom scheduler from the desired scheduler class and override its step() method as well as its __init__.
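As a rough sketch of the second suggestion: the subclass below (the name CallbackOnPlateau and the lr-comparison approach are my own, not from the thread) wraps ReduceLROnPlateau and fires a user callback whenever the parent scheduler actually lowers a learning rate. It avoids touching private internals by simply comparing the param-group learning rates before and after the parent’s step().

```python
import torch
from torch.optim.lr_scheduler import ReduceLROnPlateau

class CallbackOnPlateau(ReduceLROnPlateau):
    """Hypothetical subclass: invokes `callback` each time the
    parent scheduler reduces any param group's learning rate."""

    def __init__(self, optimizer, callback, *args, **kwargs):
        super().__init__(optimizer, *args, **kwargs)
        self.callback = callback

    def step(self, metrics):
        # Snapshot the learning rates, let the parent decide, then
        # detect whether any of them were actually reduced.
        old_lrs = [g['lr'] for g in self.optimizer.param_groups]
        super().step(metrics)
        new_lrs = [g['lr'] for g in self.optimizer.param_groups]
        if any(new < old for old, new in zip(old_lrs, new_lrs)):
            self.callback()

# Usage: with patience=0, a second non-improving val_loss triggers
# both the lr reduction and the callback.
param = torch.nn.Parameter(torch.zeros(1))
optimizer = torch.optim.SGD([param], lr=0.1, momentum=0.9)
scheduler = CallbackOnPlateau(optimizer, lambda: print('hi'),
                              mode='min', patience=0)
scheduler.step(1.0)
scheduler.step(1.0)  # plateau detected: lr reduced, callback fires
```

If you instead want to keep a plain ReduceLROnPlateau, the same before/after lr comparison can be done inline in the training loop around scheduler.step(val_loss).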