I'm getting different results when I use these two pieces of code interchangeably, but I don't know why; I expected them to be equivalent. Both are meant to configure my optimizer.
Code 1
return torch.optim.Adam(model.parameters(), weight_decay=1e-6, lr=0.001)
Code 2
updown_weights = []
updown_bias = []
for m in model.modules():
    if isinstance(m, (mcnnsae_parts.DownConv, mcnnsae_parts.UpConv)):
        for name, p in m.named_parameters():
            if 'bias' in name:
                updown_bias.append(p)
            elif 'weight' in name:
                updown_weights.append(p)
return torch.optim.Adam(
    [{'params': updown_bias, 'weight_decay': 1e-6},
     {'params': updown_weights, 'weight_decay': 1e-6}],
    lr=0.001, weight_decay=1e-6)
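One way to see why the two setups can behave differently is to count how many parameters each optimizer actually receives: Code 1 passes every parameter of the model, while Code 2 only passes parameters found inside DownConv/UpConv modules, so any parameter elsewhere in the model is never updated. Below is a minimal sketch of that check; since mcnnsae_parts isn't shown, the DownConv/UpConv classes here are hypothetical stand-ins.

```python
import torch
import torch.nn as nn

# Hypothetical stand-ins for mcnnsae_parts.DownConv / UpConv (assumption:
# the real classes are nn.Module subclasses holding weight/bias parameters).
class DownConv(nn.Module):
    def __init__(self):
        super().__init__()
        self.conv = nn.Conv2d(3, 8, 3)

class UpConv(nn.Module):
    def __init__(self):
        super().__init__()
        self.conv = nn.ConvTranspose2d(8, 3, 3)

class Model(nn.Module):
    def __init__(self):
        super().__init__()
        self.down = DownConv()
        self.up = UpConv()
        self.head = nn.Linear(10, 2)  # a layer outside DownConv/UpConv

model = Model()

# Code 1 style: every parameter of the model goes into the optimizer.
opt1 = torch.optim.Adam(model.parameters(), weight_decay=1e-6, lr=0.001)

# Code 2 style: only parameters inside DownConv/UpConv modules are collected.
selected = [p for m in model.modules()
            if isinstance(m, (DownConv, UpConv))
            for p in m.parameters()]
opt2 = torch.optim.Adam(selected, lr=0.001, weight_decay=1e-6)

n1 = sum(len(g['params']) for g in opt1.param_groups)
n2 = sum(len(g['params']) for g in opt2.param_groups)
print(n1, n2)  # here 6 vs 4: opt2 never sees head.weight / head.bias
```

If the counts differ on your real model, the missing parameters (frozen in Code 2, trained in Code 1) would explain the diverging results. Note also that per-group options override the top-level ones, so the outer `weight_decay=1e-6` in Code 2 is redundant given both groups already set it.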