PyTorch model parameters deepcopy

Hi guys,
I tried to use `copy.deepcopy`, but it changed the parameters (gradients), which I did not want.
The goal is for `P[1].grad` to be updated (and saved as the gradient of `Net_Kfast`) when `Itr % 2` is zero, and for `P[1].grad` to remain unchanged when `Itr % 2` is not zero.

    import copy

    for P in zip(net.parameters(), Net_Kfast.parameters(), *net_parameters):
        # Average the participating workers' gradients for this parameter
        temp = 0
        for p in P[2:]:
            temp += p.grad / len(Participated_Workers)

        P[0].grad = temp
        if Itr % 2 == 0:
            P[1].grad = copy.deepcopy(temp)
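For context, here is a minimal standalone sketch (with two hypothetical parameters `a` and `b` standing in for the nets) of the aliasing behavior I think is involved: assigning the same tensor object to a `.grad` field means later in-place updates through one reference are visible through the other, while a `detach().clone()` (or `deepcopy`) stays independent.

```python
import torch

# Two stand-in parameters (hypothetical, just for illustration)
a = torch.nn.Parameter(torch.zeros(3))
b = torch.nn.Parameter(torch.zeros(3))

temp = torch.ones(3)
a.grad = temp                   # a.grad aliases the same tensor as temp
b.grad = temp.detach().clone()  # independent copy of temp's data

a.grad.add_(1.0)                # in-place update through a.grad

print(temp)    # changed too, because it aliases a.grad
print(b.grad)  # the clone is untouched
```

So the copy into `P[1].grad` itself should be independent; the sharing happens between `temp` and `P[0].grad`, which point at the same tensor.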