Are the optimizer's parameters different from model.parameters()?

Hi. When we define an optimizer such as:
optimizer = torch.optim.Adam(model.parameters(), lr=0.001, amsgrad=True)
and then manually change the model's parameters in place, for example:

with torch.no_grad():
    for param in model.parameters():
        param.add_(0.1)  # placeholder for whatever manual update we do

Then when we call optimizer.step(), does the optimizer use the updated parameters from the previous step, or the original ones? In other words, does the optimizer hold references to the same tensors that model.parameters() returns, or does it copy them when it is constructed?
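
For concreteness, here is a minimal, self-contained sketch of the situation I mean (the nn.Linear toy model, the +0.1 shift, and the dummy loss are placeholders, not my real code):

import torch
import torch.nn as nn

# Toy setup
model = nn.Linear(2, 1)
optimizer = torch.optim.Adam(model.parameters(), lr=0.001, amsgrad=True)

# Manually change the parameters outside of the optimizer
with torch.no_grad():
    for param in model.parameters():
        param.add_(0.1)  # placeholder manual update

# Dummy forward/backward pass so .grad is populated
loss = model(torch.randn(4, 2)).sum()
loss.backward()

optimizer.step()  # <-- does this start from the shifted values or the originals?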