Hi All. I have a trained Xception model that I want to use as a backbone in another network. In particular I want to drop all the layers starting with the global average pooling. In the code below this means dropping the last block (GAPClassifier).
The problem is that after doing so I get a NotImplementedError.
I’m assuming that the issue is how I’m dropping the last block, namely
model=torch.nn.Sequential(*(list(model.children())[:-1]))
Are there situations where dropping a block is more complicated? What's the best way to handle this?
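For reference, the only workaround I've found so far is to replace the final block with torch.nn.Identity instead of rebuilding the model as a Sequential, so the original forward() still runs. A minimal sketch with a toy model (the attribute name classifier is a stand-in for however GAPClassifier is registered in the real model):

```python
import torch
import torch.nn as nn

class TinyNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Conv2d(3, 4, 3, padding=1)
        self.classifier = nn.Linear(4, 10)  # stands in for GAPClassifier

    def forward(self, x):
        x = self.features(x).mean(dim=(2, 3))  # conv + global average pool
        return self.classifier(x)

net = TinyNet()
net.classifier = nn.Identity()  # drop the head without touching forward()
out = net(torch.rand(2, 3, 8, 8))
print(out.shape)  # the pooled 4-channel features: torch.Size([2, 4])
```

Is swapping in Identity the recommended approach, or is there something cleaner?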
Here is a run-through of what is happening (unfortunately I can't include the entire model in this post). Thanks!
test_batch=torch.rand((3,14,SIZE,SIZE)).cuda()
model(test_batch) #<== executes successfully
def block_names(model, end=None, start=None):
    layer_list = list(model.children())
    for i, layer in enumerate(layer_list[start:end]):
        print(i, layer._get_name())
block_names(model)
"""OUTPUT
0 EntryBlock
1 ModuleList
2 SeparableStack
3 XBlock
4 SeparableStack
5 GAPClassifier
"""
model=torch.nn.Sequential(*(list(model.children())[:-1]))
block_names(model)
"""OUTPUT
0 EntryBlock
1 ModuleList
2 SeparableStack
3 XBlock
4 SeparableStack
"""
model(test_batch) # throws NotImplementedError
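Here is a minimal toy reproduction of what I think is going on (toy layers, not my real model). nn.Sequential just calls each child in order, but nn.ModuleList has no forward() of its own — the original model's forward() iterated over the list manually — so once the ModuleList becomes a direct child of a Sequential, calling it raises NotImplementedError:

```python
import torch
import torch.nn as nn

class ToyModel(nn.Module):
    def __init__(self):
        super().__init__()
        self.entry = nn.Conv2d(3, 8, 3, padding=1)
        # A ModuleList only registers submodules; it has no forward()
        self.blocks = nn.ModuleList(
            [nn.Conv2d(8, 8, 3, padding=1) for _ in range(2)]
        )
        self.classifier = nn.Linear(8, 10)  # stands in for GAPClassifier

    def forward(self, x):
        x = self.entry(x)
        for b in self.blocks:      # forward() iterates the list itself
            x = b(x)
        x = x.mean(dim=(2, 3))     # global average pooling
        return self.classifier(x)

model = ToyModel()
x = torch.rand(2, 3, 16, 16)
model(x)  # works: forward() knows how to walk the ModuleList

backbone = nn.Sequential(*list(model.children())[:-1])
try:
    backbone(x)  # Sequential calls the ModuleList directly
except NotImplementedError:
    print("NotImplementedError: ModuleList has no forward()")
```

So flattening children into a Sequential only seems safe when every child is itself callable and the original forward() was a plain straight-line chain.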