Question regarding parameters for optimizer

This is related to this thread.

The optimizer holds references to the model's parameter tensors, so any in-place change to them is seen by the optimizer. The following snippet shows that the parameters in the optimizer's param group point to the same underlying storage as the model's parameters:

```python
import torch

model = torch.nn.Linear(2, 3)
optimizer = torch.optim.Adam(model.parameters(), lr=0.001, amsgrad=True)

# Compare the storage addresses of the model's parameters
# with those held by the optimizer's param group.
x = set(p.data_ptr() for p in model.parameters())
y = set(p.data_ptr() for p in optimizer.param_groups[0]['params'])
print(x == y)  # prints True
```
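One caveat worth noting: this link only survives *in-place* modifications. If you reassign a parameter attribute to a brand-new `torch.nn.Parameter`, the optimizer keeps pointing at the old tensor. A minimal sketch illustrating both cases (the `Linear(2, 3)` model here is just an example):

```python
import torch

model = torch.nn.Linear(2, 3)
optimizer = torch.optim.Adam(model.parameters(), lr=0.001)

# In-place edits are visible to the optimizer, because it stores
# references to the very same tensors:
with torch.no_grad():
    model.weight.zero_()
opt_weight = optimizer.param_groups[0]['params'][0]
print(model.weight.data_ptr() == opt_weight.data_ptr())  # True: same storage

# Replacing the parameter with a *new* tensor breaks the link;
# the optimizer still holds the old tensor:
model.weight = torch.nn.Parameter(torch.zeros(3, 2))
print(model.weight.data_ptr() == opt_weight.data_ptr())  # False: stale reference
```

In the second case you would need to rebuild the optimizer (or update its param group) for it to track the new parameter.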