TorchScript deployment in C++ produces wrong results

Hello,

I’ve recently been trying to deploy a V-Net that I implemented in Python with PyTorch. I tested the model in Python and it works as expected. I then exported it via torch.jit.script, which also worked after adjusting some of the types used and adding a few extra type annotations.

Next I loaded the model in C++ for deployment and fed it the exact same inputs (I double-checked), but it produces entirely different output. One thing I found curious: when I inspect the weights via the named_parameters method, everything matches the Python model, but when I call dump_to_str to debug the whole module, entirely different weights are printed. I also exported the module from C++ back to Python and checked it there, and everything works fine again.

My suspicion is that the weights of the submodules and the saved parameter weights are not correctly matched / loaded by the C++ backend. Has anyone else encountered this problem, or does anyone know how to set the weights manually?
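For reference, this is roughly how I load and inspect the module in C++; the file path, input shape, and the dump_to_str flags are simplified placeholders rather than my exact code:

```cpp
#include <torch/script.h>

#include <iostream>
#include <vector>

int main() {
  // Load the scripted V-Net (path is a placeholder).
  torch::jit::script::Module module = torch::jit::load("vnet_scripted.pt");
  module.eval();

  // The parameters listed here match what I see in Python...
  for (const auto& p : module.named_parameters()) {
    std::cout << p.name << ": sum = " << p.value.sum().item<float>() << "\n";
  }

  // ...but the parameter values printed here look entirely different.
  std::cout << module.dump_to_str(/*print_method_bodies=*/false,
                                  /*print_attr_values=*/false,
                                  /*print_param_values=*/true)
            << std::endl;

  // Run the same input that produces correct results in Python
  // (the shape here is just an example).
  std::vector<torch::jit::IValue> inputs;
  inputs.push_back(torch::ones({1, 1, 64, 64, 64}));
  at::Tensor output = module.forward(inputs).toTensor();
  std::cout << output.sizes() << std::endl;

  return 0;
}
```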

Thank you.