Hi,
I understand that to load models in PyTorch we do the following:
checkpoint = torch.load(opt.model, map_location=lambda storage, loc: storage)
However, my models are in Torch (Lua). So instead of torch.load I do the following:
from torch.utils.serialization import load_lua
checkpoint = load_lua(opt.model)
However, load_lua has no map_location argument, so it does not automatically map storages to devices, right? Is there a way around this?
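In case it helps the discussion: since load_lua returns the checkpoint as loaded, one workaround is to remap the tensors yourself after loading. Below is a minimal sketch of a hypothetical helper (map_storage is my own name, not a PyTorch API) that walks a checkpoint made of nested dicts/lists and applies a mapping function, e.g. lambda t: t.cpu(), to every leaf:

```python
def map_storage(obj, fn):
    """Recursively apply fn (e.g. lambda t: t.cpu()) to every non-container leaf
    of a checkpoint built from nested dicts, lists, and tuples."""
    if isinstance(obj, dict):
        return {k: map_storage(v, fn) for k, v in obj.items()}
    if isinstance(obj, (list, tuple)):
        return type(obj)(map_storage(v, fn) for v in obj)
    # Leaf: assume it is a tensor (or other object fn knows how to handle)
    return fn(obj)
```

Usage would then look something like (assuming the checkpoint is such a nested container of tensors):

```python
from torch.utils.serialization import load_lua
checkpoint = load_lua(opt.model)
checkpoint = map_storage(checkpoint, lambda t: t.cpu())  # force everything to CPU
```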