How do I take a snapshot of a parameter?

I am sorry, I don't know the correct name for what I want. I want to compare a parameter's value at the start of training with its value at a later stage. To do so, I called detach() on the parameter just before training started. But the detached value keeps changing as training goes on, so I cannot compare the original value with the trained value, because the "original" value itself is changing.

I wrote a small piece of code to reproduce the problem. There is a simple model with just two parameters, self.p1 and self.p2. Their initial values are saved via detach() as p1_d and p2_d. After a short training run, I compared the saved p1_d / p2_d with the trained p1 / p2, and the detached values have changed as well.

How do I get a copy of a parameter that does not change during training?

```
import torch
import torch.nn as nn
import torch.optim as optim

class Model(nn.Module):
    def __init__(self):
        super(Model, self).__init__()
        self.p1 = nn.Parameter(torch.tensor([1.0]))
        self.p2 = nn.Parameter(torch.tensor([100.0]))

    def forward(self):
        return self.p1 + self.p2

model = Model()
criterion = nn.MSELoss()
optimizer = optim.Adam(model.parameters(), lr=0.1)

print("before start")
p1_d = model.p1.detach()
p2_d = model.p2.detach()
print(p1_d)
print(p2_d)

for epoch in range(10):
    loss = criterion(model(), torch.tensor([20.0]))  # target shaped like the output
    print('loss: %f' % loss.item())
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()


print('end of training')
print('original_values: ')
print(p1_d)  # not 1.0 any more - the value has changed along with model.p1
print(p2_d)  # not 100.0 any more - the value has changed along with model.p2
print('trained values')
print(model.p1)
print(model.p2)
```

I found the answer. detach() returns a tensor that shares the same memory as the original parameter, so it follows every in-place update the optimizer makes. clone(), on the other hand, allocates independent memory for the copy. So instead of detach() alone, I should use clone() in this case (or detach().clone(), which also keeps the copy out of the autograd graph).
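
To see the difference concretely, here is a minimal standalone sketch (not part of the original code) that compares the storage pointers of a detached and a cloned tensor:

```
import torch
import torch.nn as nn

p = nn.Parameter(torch.tensor([1.0]))

detached = p.detach()          # a view of the same storage as p
cloned = p.detach().clone()    # a copy with its own storage

print(p.data_ptr() == detached.data_ptr())  # True: shares memory with p
print(p.data_ptr() == cloned.data_ptr())    # False: independent memory

# In-place update, similar to what optimizer.step() does internally
with torch.no_grad():
    p.add_(5.0)

print(detached)  # tensor([6.]) - follows the parameter
print(cloned)    # tensor([1.]) - the snapshot is unchanged
```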

https://discuss.pytorch.org/t/whats-the-difference-between-variable-detach-and-variable-clone/10758
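
Here is the example from the question with the snapshot taken via detach().clone() instead (a minimal sketch; clone() alone would also copy the memory, but detaching first keeps the snapshot out of the autograd graph):

```
import torch
import torch.nn as nn
import torch.optim as optim

# Same Model as in the question
class Model(nn.Module):
    def __init__(self):
        super(Model, self).__init__()
        self.p1 = nn.Parameter(torch.tensor([1.0]))
        self.p2 = nn.Parameter(torch.tensor([100.0]))

    def forward(self):
        return self.p1 + self.p2

model = Model()
criterion = nn.MSELoss()
optimizer = optim.Adam(model.parameters(), lr=0.1)

# Snapshot with independent storage (detach() alone would keep sharing memory)
p1_d = model.p1.detach().clone()
p2_d = model.p2.detach().clone()

for epoch in range(10):
    loss = criterion(model(), torch.tensor([20.0]))
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()

print(p1_d)      # tensor([1.])   - unchanged snapshot
print(p2_d)      # tensor([100.]) - unchanged snapshot
print(model.p1)  # updated by the optimizer
print(model.p2)  # updated by the optimizer
```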