I want to change my scheduler step(loss) code so that it can also restart the Adam (or other) optimizer state. Can someone suggest a better way than explicitly replacing the optimizer with opt = optim.Adam(model.parameters(), lr=new_lr)?
Adam stores some momentum data, so if you want to reset the optimizer completely, your proposal is the best approach.
I want to remove the momentum as well, which is why I want to reset it completely. Currently my code is a bit ugly because I explicitly replace the optimizer reference and then re-initialize the scheduler with the new optimizer (see the sketch below).
It would be nice if we could reset directly from the lr_scheduler or call something like opt.reset().
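For context, here is a minimal sketch of that workaround, wrapping the re-creation of the optimizer and scheduler in a helper. The function name reset_optimizer and the choice of StepLR are purely illustrative assumptions, not an existing PyTorch API:

import torch.nn as nn
from torch import optim
from torch.optim import lr_scheduler

def reset_optimizer(model, new_lr):
    # Hypothetical helper: rebuild Adam from scratch (dropping all momentum
    # state) and create a fresh scheduler bound to the new optimizer.
    opt = optim.Adam(model.parameters(), lr=new_lr)
    sched = lr_scheduler.StepLR(opt, step_size=10)  # scheduler choice is just an example
    return opt, sched

model = nn.Linear(3, 1)
opt, sched = reset_optimizer(model, new_lr=1e-4)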
I have this problem too. Is it reasonable to reset the state with optimizer.state = collections.defaultdict(dict)?
Like this:
import torch
import torch.nn as nn
from torch import optim
import collections

m = nn.Linear(3, 1)
opt = optim.Adam(m.parameters(), lr=1e-3)

# Run one forward/backward/step so Adam populates its per-parameter state
out = m(torch.rand(3))
out.backward()
opt.step()
print(opt.state)

# Reset: Optimizer.state is a defaultdict(dict), so assigning a fresh one
# discards Adam's step counts and moment estimates
opt.state = collections.defaultdict(dict)
print(opt.state)
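Since Optimizer.state is initialized as a defaultdict(dict), clearing the existing dict in place should be equivalent to re-assigning a new one (a minimal sketch, assuming the snippet above has already run):

opt.state.clear()  # drops Adam's 'step', 'exp_avg', 'exp_avg_sq' entries in place
print(opt.state)   # back to an empty defaultdict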
What is the dict parameter passed to defaultdict()?
dict here is just Python's built-in dict type. defaultdict needs a default_factory such as dict or list, which is called to create the value for a missing key.
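To make the default_factory behaviour concrete, here is a small standalone example (plain Python, independent of PyTorch):

import collections

d = collections.defaultdict(dict)   # dict is the factory used for missing keys
d["weight"]["exp_avg"] = 0.0        # missing key "weight" is auto-created as an empty dict
print(d)                            # defaultdict(<class 'dict'>, {'weight': {'exp_avg': 0.0}})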