Can I replicate a module more than once?

Hi community,
Is it safe to call nn.parallel.replicate on the same module more than once?

E.g.:

import torch

a = torch.nn.Linear(3, 3)
a.cuda()
# Two replicas of a on device 0 from one call,
# plus a third replica from a separate call.
ar = torch.nn.parallel.replicate(a, [0, 0])
ar1 = torch.nn.parallel.replicate(a, [0])[0]

It seems to work fine in this toy example, with gradients from all replicas accumulating into a's parameters:

import torch

a = torch.nn.Linear(3, 3)
a.cuda()
ar = torch.nn.parallel.replicate(a, [0, 0])
ar1 = torch.nn.parallel.replicate(a, [0])[0]

d1 = torch.rand(3, dtype=torch.float32, device="cuda")
d2 = torch.rand(3, dtype=torch.float32, device="cuda")
o = torch.optim.SGD(a.parameters(), lr=0.1)  # lr is required in most PyTorch versions

# Backward through both replicas from the first call;
# gradients accumulate in a's parameters.
ar[0](d1).sum().backward()
ar[1](d2).sum().backward()
print(a.weight.grad)

o.zero_grad()

# Mixing replicas from the two separate replicate() calls.
ar1(d1).sum().backward()
ar[0](d2).sum().backward()
print(a.weight.grad)
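For what it's worth, here is a minimal check I put together (my own sketch, assuming a single CUDA device) that the two replicas' backward passes accumulate into the original module's `.grad` the same way as calling `a` twice directly:

```python
import torch

if torch.cuda.is_available():
    torch.manual_seed(0)
    a = torch.nn.Linear(3, 3).cuda()
    ar = torch.nn.parallel.replicate(a, [0, 0])

    d1 = torch.rand(3, device="cuda")
    d2 = torch.rand(3, device="cuda")

    # Backward through both replicas: gradients flow back into a.
    ar[0](d1).sum().backward()
    ar[1](d2).sum().backward()
    grad_replicated = a.weight.grad.clone()

    # Same two forward/backward passes directly on the original module.
    a.weight.grad = None
    a.bias.grad = None
    a(d1).sum().backward()
    a(d2).sum().backward()
    grad_direct = a.weight.grad

    print(torch.allclose(grad_replicated, grad_direct))
```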

However, I don't know whether the replicate function interacts with any global state, especially in the original module (a in this case), which would cause problems if I use it in more complex situations
(e.g. putting ar[0], ar[1], and ar1 on different devices or different streams).
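To be concrete, the multi-device case I have in mind would look something like this sketch (hypothetical, needs at least two GPUs):

```python
import torch

if torch.cuda.device_count() >= 2:
    a = torch.nn.Linear(3, 3).cuda(0)
    # One replicate() call spanning two devices...
    r0, r1 = torch.nn.parallel.replicate(a, [0, 1])
    # ...and a second, independent replication of the same module.
    r2 = torch.nn.parallel.replicate(a, [0])[0]

    # Each replica is used on its own device.
    out0 = r0(torch.rand(3, device="cuda:0"))
    out1 = r1(torch.rand(3, device="cuda:1"))
```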

Thanks in advance.

Regards,
Cat