Hi @Shisho_Sama,
For `Tensor`s, in most cases you should go for `clone`, since this is a PyTorch operation that will be recorded by autograd.
>>> t = torch.rand(1, requires_grad=True)
>>> t.clone()
tensor([0.4847], grad_fn=<CloneBackward>) # <=== as you can see here
When it comes to `Module`, there is no `clone` method available, so you can either use `copy.deepcopy` or create a new instance of the model and just copy the parameters, as proposed in this post: Deep copying PyTorch modules.
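A minimal sketch of both options, using a plain `nn.Linear` just for illustration (the layer shape here is arbitrary):

```python
import copy

import torch
import torch.nn as nn

model = nn.Linear(4, 2)

# Option 1: deep-copy the whole module (parameters, buffers, and submodules).
model_copy = copy.deepcopy(model)

# Option 2: build a fresh instance and copy the parameters over.
model_fresh = nn.Linear(4, 2)
model_fresh.load_state_dict(model.state_dict())

# Both copies hold equal but independent parameters:
# mutating one does not affect the original.
print(torch.equal(model.weight, model_copy.weight))   # True
print(model_copy.weight is model.weight)              # False
```

With either option, the new module's parameters are detached from the original's autograd history, unlike `clone` on a tensor.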