Saving torch models

Ok, got it!

import torch

model = Classifier()
torch.save(model.state_dict(), './model_Q2.pth')  # save only the learned parameters (the state_dict)
model.load_state_dict(torch.load('./model_Q2.pth', map_location=lambda storage, loc: storage))

The map_location=lambda storage, loc: storage argument tells torch.load to map every saved tensor onto the CPU, so a checkpoint that was saved on a GPU can be loaded on a machine without one.
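As a minimal sketch (assuming the same Classifier class and checkpoint path as above), torch.load also accepts a device string or a torch.device for map_location, which reads a bit more clearly than the lambda:

import torch

model = Classifier()

# Equivalent ways to force the checkpoint onto the CPU
state = torch.load('./model_Q2.pth', map_location='cpu')
# state = torch.load('./model_Q2.pth', map_location=torch.device('cpu'))

model.load_state_dict(state)
model.eval()  # switch to inference mode (affects dropout / batch-norm layers)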

See this related thread: On a cpu device, how to load checkpoint saved on gpu device