I saved a trained net that defines an unused layer instance in its __init__ method:
class MyNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.layer1 = nn.Conv2d(3, 32, 3)
        self.layer2 = nn.Conv2d(32, 64, 3)
        self.layer_not_used = nn.Conv2d(64, 2, 3)  # not used in forward

    def forward(self, x):
        out = self.layer1(x)
        out = self.layer2(out)
        return out
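For concreteness, I assume the net was saved with the standard state_dict workflow (the filename is just an example); a minimal sketch:

```python
import os
import tempfile

import torch
import torch.nn as nn


class MyNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.layer1 = nn.Conv2d(3, 32, 3)
        self.layer2 = nn.Conv2d(32, 64, 3)
        self.layer_not_used = nn.Conv2d(64, 2, 3)  # not used in forward

    def forward(self, x):
        out = self.layer1(x)
        out = self.layer2(out)
        return out


# After training, save only the parameters, not the pickled module.
# The checkpoint therefore contains keys for every registered layer,
# including layer_not_used.
path = os.path.join(tempfile.gettempdir(), "mynet.pth")
torch.save(MyNet().state_dict(), path)
```

Note that the unused layer still gets entries (layer_not_used.weight, layer_not_used.bias) in the saved state_dict, because registration in __init__, not use in forward, is what puts a layer's parameters into the checkpoint.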
Several days later I want to load the net from disk, but in my new code I have already removed the unused layer from __init__. The net is now:
class MyNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.layer1 = nn.Conv2d(3, 32, 3)
        self.layer2 = nn.Conv2d(32, 64, 3)

    def forward(self, x):
        out = self.layer1(x)
        out = self.layer2(out)
        return out
And I found that loading the trained net fails: the saved state_dict still contains weights for layer_not_used, which the new model no longer defines, so load_state_dict raises an error about unexpected keys.
How can I load my trained net successfully when the checkpoint contains weights for unused layers?
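One way to handle this is load_state_dict's strict=False flag, which ignores checkpoint keys the new model lacks (and vice versa). A minimal sketch, building the old checkpoint in memory instead of loading it from disk:

```python
import torch
import torch.nn as nn


class OldNet(nn.Module):
    """The original definition, including the unused layer."""

    def __init__(self):
        super().__init__()
        self.layer1 = nn.Conv2d(3, 32, 3)
        self.layer2 = nn.Conv2d(32, 64, 3)
        self.layer_not_used = nn.Conv2d(64, 2, 3)


class MyNet(nn.Module):
    """The new definition, without the unused layer."""

    def __init__(self):
        super().__init__()
        self.layer1 = nn.Conv2d(3, 32, 3)
        self.layer2 = nn.Conv2d(32, 64, 3)

    def forward(self, x):
        out = self.layer1(x)
        out = self.layer2(out)
        return out


# Stands in for torch.load("mynet.pth") on a real checkpoint.
state = OldNet().state_dict()

net = MyNet()
# strict=False skips keys in the checkpoint that the new model
# does not have; the return value reports what was skipped.
result = net.load_state_dict(state, strict=False)
print(result.unexpected_keys)  # the layer_not_used entries
print(result.missing_keys)     # empty: all of MyNet's weights loaded
```

It is worth checking result.missing_keys is empty after loading, so that a typo in a layer name does not silently leave weights uninitialized. Alternatively, you can filter the dict yourself before a strict load, e.g. drop every key starting with "layer_not_used." and call net.load_state_dict(filtered) with the default strict=True.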