How to copy torch.nn.Sequential weights from a source class to a target class

Hi,

I’ve been trying to convert a PyTorch model to a TensorRT model using the torch_tensorrt tool. The following code is used for the conversion:

import torch
import torch_tensorrt

# precision is defined elsewhere in my code, e.g. precision = {torch.float32}
trt_model = torch_tensorrt.compile(
    torch_model,
    inputs=[torch_tensorrt.Input((2048, 1, 32, 32), dtype=torch.float32)],
    enabled_precisions=precision,
    workspace_size=1 << 33,
)

However, the forward function of the model uses a global variable, and that is causing a problem in the TensorRT conversion. Here is the error message:

python value of type 'float' cannot be used as a value. Perhaps it is a closed over global variable? If so, please consider passing it in as an argument or use a local varible instead.:
  File "/codebase/Net/model.py", line 65
    def forward(self, patch):
        descr = self.desc_norm(self.layers(patch) + eps_l2_norm)
                                                    ~~~~~~~~~~~ <--- HERE
        descr = descr.view(descr.size(0), -1)
        return descr

My initial solution is to add eps_l2_norm as a class attribute in the network class. I tried creating a wrapper torch.nn.Module class that loads the weights from the original model and adds eps_l2_norm as a new attribute, then saved it as a new model, but the TensorRT conversion fails with the same error. I also tried loading the original weights into a modified version of the original class that has the new eps_l2_norm attribute, but then I cannot load the pretrained weights at all.
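For clarity, this is roughly what I mean by turning the global into an attribute (a sketch, not my actual model; the module names layers and desc_norm come from the traceback above, and the eps value and placeholder layers are assumptions):

import torch
import torch.nn as nn

class NetWithEps(nn.Module):
    def __init__(self, eps_l2_norm=1e-10):  # assumed value
        super().__init__()
        # placeholder modules; the real model builds these in its own __init__
        self.layers = nn.Sequential(nn.Conv2d(1, 8, 3, padding=1))
        self.desc_norm = nn.Sequential(nn.LocalResponseNorm(8))
        self.eps_l2_norm = eps_l2_norm  # stored on self instead of a global

    def forward(self, patch):
        # self.eps_l2_norm is an attribute, so TorchScript can resolve it
        descr = self.desc_norm(self.layers(patch) + self.eps_l2_norm)
        descr = descr.view(descr.size(0), -1)
        return descr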

My other idea is to copy the weights from the old model into a new model that has the new class attribute, since the only change is the added attribute. The model has two torch.nn.Sequential modules, and I used the following code to copy the weights from the old model to the new one:

# copy weights from old model layers module to new model layers module
source_layers = old_model.layers
target_layers = new_model.layers
for source_param, target_param in zip(source_layers.parameters(), target_layers.parameters()):
    target_param.data.copy_(source_param.data)

# copy weights from old model desc_norm module to new model desc_norm module
source_desc_norm = old_model.desc_norm
target_desc_norm = new_model.desc_norm

for source_param, target_param in zip(source_desc_norm.parameters(), target_desc_norm.parameters()):
    target_param.data.copy_(source_param.data)
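To make sure the copy itself worked, a check along these lines can be used (a sketch, assuming both models have the same parameter layout):

# verify that every copied parameter is bitwise identical
for src, tgt in zip(old_model.parameters(), new_model.parameters()):
    assert torch.equal(src, tgt), "parameter mismatch after copy"

Note that parameters() only yields learnable tensors, so anything registered as a buffer (for example BatchNorm running statistics) is not covered by either the copy loops or this check.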

However, the inference results from the two models over the same input data are not the same. How can I solve this situation?
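For reference, the comparison looks roughly like this (a sketch; the real input is actual patch data rather than random noise, and the tolerance is an assumption):

# compare outputs in eval mode so dropout/batchnorm behave deterministically
old_model.eval()
new_model.eval()
with torch.no_grad():
    patch = torch.randn(2048, 1, 32, 32)  # dummy input with the expected shape
    out_old = old_model(patch)
    out_new = new_model(patch)
print(torch.allclose(out_old, out_new, atol=1e-6))  # prints False in my case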