Missing keys & unexpected keys in state_dict when loading self-trained model

I use torch.save(model.state_dict(), 'file_name.pt') to save the model. If I don't use nn.DataParallel when saving the model, I also don't need it when loading, right?
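My understanding of the DataParallel part (from a quick check with a toy nn.Linear, not my actual model) is that wrapping a model in nn.DataParallel prefixes every state_dict key with module., which is why I assume I can skip it on both the save and load side:

```python
import torch.nn as nn

model = nn.Linear(4, 2)
print(list(model.state_dict().keys()))
# ['weight', 'bias']

# Wrapping in DataParallel stores the model as self.module,
# so every key in the state_dict gains a "module." prefix
wrapped = nn.DataParallel(model)
print(list(wrapped.state_dict().keys()))
# ['module.weight', 'module.bias']
```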
If so: I torch.save(model.state_dict(), 'mymodel.pt') in one .py file during training, then try to load it in a new file with model.load_state_dict(torch.load('mymodel.pt')), and it gives me this error:
```
RuntimeError: Error(s) in loading state_dict for model: Missing key(s) in state_dict: "l1.W", "l1.b", "l2.W", "l2.b".
```
My model is a self-defined model with two self-constructed layers, l1 and l2, each of which has parameters W and b.
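Here is a minimal sketch of roughly what my setup looks like (heavily simplified; MyLayer and MyModel are placeholders, and my real layers do more than a linear map):

```python
import torch
import torch.nn as nn

class MyLayer(nn.Module):
    """Simplified stand-in for my self-constructed layer."""
    def __init__(self, in_features, out_features):
        super().__init__()
        # W and b are the per-layer parameters mentioned above
        self.W = nn.Parameter(torch.randn(out_features, in_features))
        self.b = nn.Parameter(torch.zeros(out_features))

    def forward(self, x):
        return x @ self.W.t() + self.b

class MyModel(nn.Module):
    def __init__(self):
        super().__init__()
        self.l1 = MyLayer(10, 20)
        self.l2 = MyLayer(20, 5)

    def forward(self, x):
        return self.l2(torch.relu(self.l1(x)))

# in the training script
model = MyModel()
torch.save(model.state_dict(), 'mymodel.pt')

# in the separate loading script
model = MyModel()
model.load_state_dict(torch.load('mymodel.pt'))
```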
Can anyone help me figure out what is going wrong? Thanks!