I have designed a new optimiser. The thing is, I need a comparison so I can say, "hey, look at these optimisers, their losses, learning rates, etc., and the one I designed is better." I have been unable to get a proper learning rate out of the optimiser itself, but I see that the learning rate can be obtained from a learning rate scheduler. So, can I use an optimiser as a learning rate scheduler?
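For context, here is roughly what I have been trying, with plain SGD and StepLR as stand-ins for my own optimiser (so these names are just placeholders). I can read the rate from the scheduler, but I was not sure whether reading it straight from the optimiser's `param_groups` is the intended way:

```python
import torch

model = torch.nn.Linear(4, 2)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
scheduler = torch.optim.lr_scheduler.StepLR(optimizer, step_size=10, gamma=0.5)

# The current learning rate lives in the optimizer's param_groups,
# whether or not a scheduler is attached:
current_lr = optimizer.param_groups[0]["lr"]
print(current_lr)  # 0.1

# With a scheduler attached, the same value is exposed via get_last_lr():
print(scheduler.get_last_lr())  # [0.1]
```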
Also, if I happen to get some time, I am actually thinking about contributing to this nice library. If something like an LR_view of the optimiser is needed, I can probably work on it (a rough sketch of what I mean is below). I am also thinking about integrating the optimiser I have been building into PyTorch in some form (and I really hope I do that). So, I may follow up on GitHub if you think that is better.
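To make the idea concrete: LR_view is just a name I am using here, not existing PyTorch API. A rough sketch of the helper I have in mind would be something like:

```python
import torch

def lr_view(optimizer: torch.optim.Optimizer) -> list[float]:
    """Hypothetical helper: return the current learning rate of every
    param group, so optimisers can be compared without needing a scheduler."""
    return [group["lr"] for group in optimizer.param_groups]

# Usage with any optimiser:
model = torch.nn.Linear(4, 2)
opt = torch.optim.Adam(model.parameters(), lr=3e-4)
print(lr_view(opt))  # [0.0003]
```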