Initializing a differentiable parameter in PyTorch

I’m trying to define a set of new parameters B in a PyTorch model, and I would like to initialize them with the model’s current weights W.

Question: I want the parameters B to be differentiable, but autograd should not track their history back to W (so B should get its own memory, with no reference to W). Is B = nn.Parameter(W.detach().clone()) the right way to do this?
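
For context, here is a minimal sketch of what I mean; the model, names, and shapes are just placeholders for illustration:

```python
import torch
import torch.nn as nn

# Hypothetical model; W stands in for an existing weight I want to copy from.
model = nn.Linear(4, 3)
W = model.weight  # existing parameter

# Proposed initialization: copy W's values into fresh memory, then make it a new leaf parameter.
B = nn.Parameter(W.detach().clone())
model.register_parameter("B", B)  # register so it shows up in model.parameters() / state_dict
```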

I understand that B = W.clone() will result in autograd tracking the history of B back to W during differentiation, and that B = W.detach().clone() will not be differentiable (its requires_grad is False).
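
To make the comparison concrete, this is the small check I have in mind for the three variants (shapes are arbitrary):

```python
import torch
import torch.nn as nn

W = nn.Parameter(torch.randn(3, 4))

B1 = W.clone()                         # differentiable, but gradients flow back to W
B2 = W.detach().clone()                # new memory, but requires_grad is False
B3 = nn.Parameter(W.detach().clone())  # new memory, leaf tensor, requires_grad=True

print(B1.requires_grad, B1.grad_fn)    # True, <CloneBackward0 ...> -> history points to W
print(B2.requires_grad, B2.grad_fn)    # False, None -> not differentiable
print(B3.requires_grad, B3.grad_fn)    # True, None -> differentiable leaf, detached from W
print(B3.data_ptr() == W.data_ptr())   # False -> separate storage from W
```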