I want to remove the last Linear layer and then use add_module, but I don’t know how to remove it.
Or is it okay to just replace it with model[10].add_module('1', torch.nn.Linear(2048, 365))?
I have a problem replicating this behaviour in VGG.
It looks to me like ResNet and VGG are constructed differently. I can’t simply do list(model.children())[:-1], because it retrieves an entire nn.Sequential of 6 modules, and I can’t go deeper with the indexing. I retrieve that same Sequential whether I try list(model.children())[-1] or list(model.children())[:][-1].
The only way I have found to do it is:

```python
import torch.nn as nn
from torchvision import models

model = models.vgg16(pretrained=True)
model.name = 'VGG_Baseline'

# Rebuild the classifier with a new final Linear layer, then flatten
# everything back into a single nn.Sequential.
last = nn.Linear(list(model.children())[-1][-1].in_features, out_size)
block = nn.Sequential(*list(model.children())[-1][:-1], last)
model = nn.Sequential(*list(model.children())[:-1], block)
```
However, this way I can’t call requires_grad_(False) on every layer except this last one.