Net.load_state_dict does not work

I saved a trained net that defines an unused layer instance in __init__:

import torch
import torch.nn as nn

class MyNet(nn.Module):
  def __init__(self):
    super().__init__()
    self.layer1 = nn.Conv2d(3, 32, 3)
    self.layer2 = nn.Conv2d(32, 64, 3)
    self.layer_not_used = nn.Conv2d(64, 2, 3)  # registered as a submodule, but never called in forward
  def forward(self, x):
    out = self.layer1(x)
    out = self.layer2(out)
    return out
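
I saved it roughly like this (PATH stands in for my checkpoint file). Note that state_dict() records parameters for every registered submodule, whether or not forward ever uses it:

net = MyNet()
# ... training ...
torch.save(net.state_dict(), PATH)

print(net.state_dict().keys())
# odict_keys(['layer1.weight', 'layer1.bias', 'layer2.weight', 'layer2.bias',
#             'layer_not_used.weight', 'layer_not_used.bias'])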

Several days later I want to load the net from disk, but in my new code I have already removed the unused layer from __init__. The net is now:

class MyNet(nn.Module):
  def __init__(self):
    super().__init__()
    self.layer1 = nn.Conv2d(3, 32, 3)
    self.layer2 = nn.Conv2d(32, 64, 3)
  def forward(self, x):
    out = self.layer1(x)
    out = self.layer2(out)
    return out

And I found that I could not load the trained net: loading fails because the checkpoint still contains parameters for the removed layer.
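
A minimal reproduction of the failure with the new class definition (the default load is strict):

net = MyNet()  # new definition, without layer_not_used
net.load_state_dict(torch.load(PATH))
# RuntimeError: Error(s) in loading state_dict for MyNet:
#   Unexpected key(s) in state_dict: "layer_not_used.weight", "layer_not_used.bias".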

How can I load my trained net successfully when the saved checkpoint contains layers that no longer exist in the model?

A way to fix your issue might be hidden here: https://pytorch.org/tutorials/recipes/recipes/warmstarting_model_using_parameters_from_a_different_model.html

Maybe it’s enough to do something like:
net = MyNet()
net.load_state_dict(torch.load(PATH), strict=False)
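
One caveat: strict=False silently ignores missing keys as well as unexpected ones, so it is worth inspecting what load_state_dict reports back (it returns a named tuple with missing_keys and unexpected_keys):

net = MyNet()
result = net.load_state_dict(torch.load(PATH), strict=False)
print(result.missing_keys)     # should be [] here, everything in the new net was loaded
print(result.unexpected_keys)  # ['layer_not_used.weight', 'layer_not_used.bias']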
