Using a self-trained VGG on **2 GPUs** (problem when loading it)

Hello,

I've run into a problem. I used the examples/imagenet/main.py script to train a VGG on 2 GPUs. Training went fine, and I now want to use the trained network as the input layers of my own model, but I haven't managed to load it: something seems to go wrong when reading the checkpoint. I've read the discussions on this topic and tried several ways to convert the GPU model to a CPU one (e.g. https://github.com/e-lab/pytorch-toolbox/tree/master/cuda2cpu), but nothing works.
Every time, there is either missing or extra information within the tar file…
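
For reference, this is roughly what I've been trying (a minimal sketch: the checkpoint path and the VGG variant are placeholders, and I'm assuming the checkpoint was saved by main.py with the weights under the 'state_dict' key):

```python
import torch
import torchvision.models as models

# Remap GPU tensors to CPU storage so the checkpoint can be
# loaded on a machine without CUDA.
checkpoint = torch.load('checkpoint.pth.tar',
                        map_location=lambda storage, loc: storage)

# If I read main.py correctly, it wraps model.features in
# nn.DataParallel for VGG, so the saved keys look like
# 'features.module.0.weight'; strip the 'module.' segment so the
# keys match a plain, unwrapped model.
state_dict = {k.replace('module.', '', 1): v
              for k, v in checkpoint['state_dict'].items()}

model = models.vgg16()  # placeholder: same variant as used for training
model.load_state_dict(state_dict)
```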

Is there a general solution to this problem?