Why do I get empty output after summing the parameters of two models and storing the result in one of them?

I am trying to implement the paper Cross-Domain FRCN for few-shot detection, which requires this parameter update:

```
teacher_parameters = teacher_parameters + ALPHA * student_parameters
```

So I am implementing that update, but when I run testing the bounding-box output is empty.
The interesting part is that the empty output only occurs when I sum the two sets of parameters. If I just scale one set, the model gives output, like:

```
teacher_parameters = ALPHA * student_parameters
```

So the problem occurs only when I sum the two sets of parameters.
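For reference, here is a minimal sketch of the update I am doing, with small Linear layers standing in for the Faster R-CNN models (the ALPHA value here is just an example):

```python
import torch
import torch.nn as nn

# small stand-ins for the teacher and student detectors
teacher = nn.Linear(4, 2)
student = nn.Linear(4, 2)
ALPHA = 0.5  # example value

w0 = teacher.weight.detach().clone()  # snapshot to see the change

with torch.no_grad():
  for t, s in zip(teacher.parameters(), student.parameters()):
    t.add_(s, alpha=ALPHA)  # teacher = teacher + ALPHA * student
```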

ADDITIONAL INFORMATION

  1. Both models are `torchvision.models.detection.fasterrcnn_resnet50_fpn(pretrained=True)`.
  2. There is one teacher and one student model; I use the teacher model for the inference stage.

Hi @CVLab_005,

Firstly, can you share code via 3 backticks ``` instead of screenshots, including a minimal reproducible example with the expected behavior?

Secondly, I think you may have a problem when you re-wrap your parameters and assign them to the .data attribute, which is deprecated. Try the following:

```
with torch.no_grad():
  for p_out, p_in1, p_in2 in zip(teacher.parameters(), student.parameters(), teacher.parameters()):
    p_out.fill_(p_in1 + p_in2)
```

Also, could you not simplify this with an in-place add?

```
with torch.no_grad():
  for t, s in zip(teacher.parameters(), student.parameters()):
    t.add_(s)
```
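One caveat with in-place updates: they need to run under `torch.no_grad()`, because an in-place op on a leaf parameter that requires grad raises a RuntimeError. A toy example:

```python
import torch
import torch.nn as nn

layer = nn.Linear(2, 2)  # parameters require grad by default

raised = False
try:
  layer.weight.add_(1.0)  # in-place op on a leaf tensor tracked by autograd
except RuntimeError:
  raised = True  # "a leaf Variable that requires grad is being used in an in-place operation"

with torch.no_grad():
  layer.weight.add_(1.0)  # fine: autograd is not recording here
```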

Thank you for the reply.
But this,

```
with torch.no_grad():
  for p_out, p_in1, p_in2 in zip(teacher.parameters(), student.parameters(), teacher.parameters()):
    p_out.fill_(p_in1 + p_in2)
```

gives an error saying fill_ only works on 0-dimensional tensors, and my parameters have more than 0 dimensions.
I also tried this earlier:

```
with torch.no_grad():
  for t_param, s_param in zip(teacher.parameters(), student.parameters()):
    # print(type(t_param.data))
    t_param.add_(s_param)
```

But it gives empty output.
I tried another approach, like this one:

```
def update_parameters():
  with torch.no_grad():
    for tparam, sparam in zip(teacher.parameters(), student.parameters()):
      tparam[:2] = tparam[:2] + ALPHA * sparam[:2]
```

This works only once: I can update once and still get output, but if I call this update_parameters function multiple times it again gives empty outputs.

Guys, I found the mistake I made while coding. The equation is

```
teacher_parameters = ALPHA * teacher_parameters + (1 - ALPHA) * student_parameters
```

not

```
teacher_parameters = teacher_parameters + ALPHA * student_parameters
```

Since my teacher's parameters kept increasing, the bounding boxes kept decreasing in size with every iteration, and after some iterations they vanished.
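In code, the corrected update looks like this (again with small Linear layers as stand-ins for the detectors, and an example ALPHA):

```python
import torch
import torch.nn as nn

teacher = nn.Linear(4, 2)
student = nn.Linear(4, 2)
ALPHA = 0.99  # example smoothing factor

w0 = teacher.weight.detach().clone()  # snapshot to verify the update

def update_parameters():
  # teacher = ALPHA * teacher + (1 - ALPHA) * student
  with torch.no_grad():
    for t, s in zip(teacher.parameters(), student.parameters()):
      t.mul_(ALPHA).add_(s, alpha=1 - ALPHA)

update_parameters()
```

Because ALPHA < 1, this keeps the teacher's parameters as a bounded moving average instead of letting them grow with every iteration.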

After correcting my code, every solution provided above works.
Sorry for the inconvenience, and thanks for the help.