Re-initialization of weights

Hello everyone,

Is there a mechanism or tool in PyTorch that can detect that the optimizer is stuck in a local minimum and trigger a re-initialization of the weights?

Thanks in advance!

There isn’t, but that should only take a few lines of code to implement.
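A minimal sketch of what that could look like: track the best loss seen so far, and if it doesn't improve for a number of checks, re-initialize the model. The class name `PlateauReinitializer` and the `patience`/`threshold` parameters are my own choices here (modeled loosely on `ReduceLROnPlateau`); the only PyTorch-specific assumption is that `model.apply` visits submodules and that most built-in layers (`Linear`, `Conv2d`, ...) define `reset_parameters()`, which restores their default initialization.

```python
class PlateauReinitializer:
    """Re-initializes a model's weights when the loss stops improving.

    Expects a PyTorch ``nn.Module``: ``model.apply`` visits every submodule,
    and layers that implement ``reset_parameters()`` get their default
    initialization back. Names and defaults are illustrative.
    """

    def __init__(self, model, patience=10, threshold=1e-4):
        self.model = model
        self.patience = patience    # checks without improvement before reset
        self.threshold = threshold  # minimum decrease that counts as improvement
        self.best = float("inf")
        self.num_bad = 0

    def step(self, loss):
        """Call once per epoch with the current loss; returns True on reset."""
        if loss < self.best - self.threshold:
            self.best = loss
            self.num_bad = 0
        else:
            self.num_bad += 1
        if self.num_bad >= self.patience:
            # Re-initialize every submodule that supports it
            self.model.apply(
                lambda m: m.reset_parameters()
                if hasattr(m, "reset_parameters")
                else None
            )
            self.best = float("inf")
            self.num_bad = 0
            return True
        return False
```

Note that if you reset the weights this way, you probably also want to recreate the optimizer (or clear its state), since e.g. Adam's moment estimates and SGD's momentum buffers still reflect the old trajectory.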

There are some related tricks for scheduling the learning rate, e.g. `ReduceLROnPlateau`: http://pytorch.org/docs/master/optim.html?highlight=scheduler#torch.optim.lr_scheduler.ReduceLROnPlateau
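For reference, the scheduler from that link follows the same plateau-detection pattern, but lowers the learning rate instead of resetting weights. A minimal sketch (the tiny `Linear` model and constant placeholder loss are just for illustration):

```python
import torch

model = torch.nn.Linear(10, 1)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
# Halve the learning rate after 2 consecutive epochs without improvement
scheduler = torch.optim.lr_scheduler.ReduceLROnPlateau(
    optimizer, mode="min", factor=0.5, patience=2
)

for epoch in range(5):
    val_loss = 1.0  # placeholder: compute your validation loss here
    scheduler.step(val_loss)
```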