Swap weights during training

Dear All,

I am implementing a two-branch model in which both branches have the same architecture. During training, some layers swap weights between the branches.

That is,
some layers in branch_A copy weights from branch_B;
some layers in branch_B copy weights from branch_A.
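
For concreteness, a rough sketch of the setup (the layer types and sizes here are just placeholders):

import torch.nn as nn

def make_branch():
    # Both branches are built the same way; nn.ModuleList keeps the
    # layers index-aligned so they can be matched up layer by layer.
    return nn.ModuleList([nn.Linear(16, 16) for _ in range(4)])

branch_A = make_branch()
branch_B = make_branch()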

Any solution for this?
Thank you.

Could you share a minimal reproducible example?

If branch_A and branch_B are separate objects that each hold all the layers of their branch (e.g. an nn.ModuleList), you could zip over the two branches in a for-loop, like:

import torch

for layer_A, layer_B in zip(branch_A, branch_B):
    for param_A, param_B in zip(layer_A.parameters(), layer_B.parameters()):
        with torch.no_grad():  # keep the copies out of autograd
            if branch_A_to_branch_B:
                param_B.copy_(param_A)  # copy A's weights into B
            elif branch_B_to_branch_A:
                param_A.copy_(param_B)  # copy B's weights into A
            # else: no copy, leave both parameters as they are

But be careful not to overwrite weights that are still needed: if both directions run in the same step, copying A into B destroys B's original weights before they can be copied back into A. A true swap needs a temporary copy of one side first, as in the sketch below.
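
A minimal sketch of such a swap, assuming branch_A and branch_B are nn.ModuleLists with index-aligned layers (the helper name and the layer indices are just placeholders):

import torch

def swap_layer_weights(layer_A, layer_B):
    # Exchange the weights of two architecturally identical layers
    # without losing either side's values.
    with torch.no_grad():
        for param_A, param_B in zip(layer_A.parameters(), layer_B.parameters()):
            tmp = param_A.detach().clone()  # preserve A before overwriting
            param_A.copy_(param_B)          # A takes B's weights
            param_B.copy_(tmp)              # B takes A's original weights

# Swap only the layers that should trade weights, e.g. layers 1 and 3:
for i in (1, 3):
    swap_layer_weights(branch_A[i], branch_B[i])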