How to switch parameters between two models

As mentioned in the title, I'm training models A and B alternately, and both models have the same structure.
During training, I iteratively switch all the weights from A to B. Is there a better way to do that, or am I doing it right?
My implementation is something like this:
```python
for s1, s2 in zip(Amodel.A_convLayers.parameters(), Bmodel.B_convLayers.parameters()):
    s1 = s2
```

This is difficult to do unless you use a hack like the one in "Caching parameters, and randomly using one of them for computing gradients".

Still, what you did is wrong, because it only reassigns the local variable s1 rather than copying the tensor contents.
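A minimal sketch of an in-place copy that does mutate the tensor contents, assuming the two submodules (A_convLayers and B_convLayers from the snippet above) have matching parameter shapes:

```python
import torch

# Copy B's weights into A's existing parameter tensors, in place.
# no_grad() keeps the copy out of the autograd graph.
with torch.no_grad():
    for s1, s2 in zip(Amodel.A_convLayers.parameters(),
                      Bmodel.B_convLayers.parameters()):
        s1.copy_(s2)  # writes into s1's storage instead of rebinding the name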

I don't quite understand "local variable s1 rather than the tensor contents", since isn't "A_convLayers.parameters()" how we get the weights and other parameters of the layers?
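parameters() does yield the actual parameter tensors; the problem is that `s1 = s2` rebinds the Python name s1 to a different tensor, leaving the model's tensor untouched. A small self-contained demonstration of the difference, using a toy nn.Linear layer (an assumption for illustration, not the poster's model):

```python
import torch
import torch.nn as nn

layer = nn.Linear(2, 2)
new_weight = torch.ones(2, 2)

# Rebinding: the loop variable p now points at new_weight,
# but layer.weight itself is unchanged.
for p in layer.parameters():
    p = new_weight
print(layer.weight)  # still the original random values

# In-place copy: mutates the parameter's storage.
with torch.no_grad():
    layer.weight.copy_(new_weight)
print(layer.weight)  # now all ones
```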

Since the switch process is only triggered at a frequency of 0.05 per epoch, would it work to just save model A and load it, as if loading a pretrained checkpoint, to overwrite B_model?
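A minimal sketch of that checkpoint approach, assuming both models share the same architecture (the filename is hypothetical):

```python
import torch

# Save A's parameters to disk, then load them into B.
# load_state_dict copies the values into B's own tensors.
torch.save(Amodel.state_dict(), "a_checkpoint.pt")
Bmodel.load_state_dict(torch.load("a_checkpoint.pt"))

# If the swap happens within one process, the round trip through
# disk is unnecessary; an in-memory copy does the same thing:
Bmodel.load_state_dict(Amodel.state_dict())
```

Note that load_state_dict matches parameters by name, so this only works if the two state dicts have identical keys and shapes, which is the case here since both models have the same structure.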