Saving the updated parameters at each epoch

Hello, I am trying to save the updated parameters at each epoch during training, and I have been using the code below.


startTime = time.time()
for t in range(n_steps):
    # Compute the loss (negative ELBO)
    loss = -model.compute_elbo_loop(x_train_1, y_train_1, x_train_2, y_train_2)
    # Zero gradients
    optimizer.zero_grad()
    # Compute gradients
    loss.backward()
    #a = list(model.parameters())[0].clone() 
    #print(list(model.parameters())[0].grad,a)
    
    optimizer.step()
    #b = list(model.parameters())[0].clone() 
    #print(b)
    if t % 10 == 9:
        loss_array[int((t + 1) / 10 - 1)] = loss.item()
        time_array[int((t + 1) / 10 - 1)] = time.time() - startTime 
        
        for name, param in model.named_parameters():
            print(name, param)
            param_dictionary[name] = param_dictionary[name] + [param.cpu().detach().numpy()]
        
        # store updated parameters in the dictionary
        params_dict[f't{t}_update{0}'] = model.state_dict().copy()
    
        print(f"Loss: {loss.item()}, Step [{t}/{n_steps}]")
        print(model.ModelString())

endTime = time.time()

I have checked that the parameters are being updated at each epoch. But when I check ‘param_dictionary’, all of the saved values have been changed to the latest ones. For example, if the current updated value for a certain parameter is 3.8, then every value saved in the dictionary is 3.8. I have used the same dictionary code in other projects, and it worked fine. Is there a specific reason why this is happening here? Am I missing something?

Hi,
This is because .detach() returns a tensor that shares the same storage as the original parameter, and .numpy() returns a NumPy array backed by that same storage, so every entry in your list keeps pointing at the live parameter data.
To take an independent snapshot, add .clone(), so replace your code with the following and it should work:
param.cpu().detach().clone().numpy()
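
For reference, here is a minimal, self-contained sketch of that sharing behaviour on a standalone parameter (the value 3.8 is made up for illustration): the array taken without .clone() silently follows later in-place updates, while the cloned one keeps the value it had at save time.

import torch

p = torch.nn.Parameter(torch.tensor([3.8]))

shared = p.cpu().detach().numpy()          # .cpu() on a CPU tensor is a no-op, so this array shares p's storage
copied = p.cpu().detach().clone().numpy()  # .clone() allocates new storage before the NumPy conversion

with torch.no_grad():
    p += 1.0                               # simulate an optimizer step updating the parameter in place

print(shared)  # ~[4.8]  -> tracks the live parameter
print(copied)  # ~[3.8]  -> frozen copy from save time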

Thank you, Srishti Gureja!

I just checked the code with ‘clone()’, and now it saves all the results correctly! Thank you so much!