I’m using this implementation of SegNet in PyTorch, and I want to fine-tune it. I’ve read online and found this method (basically freezing all layers except the last one in your net). My problem is that SegNet has more than 100 layers, and I’m looking for a simpler way to do it than writing 100 lines of code.
Do you think this could work? Or is this utter nonsense?
```python
import torch.optim as optim

model = SegNet()

# Freeze everything except the last layer
for name, param in model.named_parameters():
    if not name.startswith('conv11d'):  # the last layer should remain active
        param.requires_grad = False

# Only hand the still-trainable parameters to the optimizer
optimizer = optim.SGD((p for p in model.parameters() if p.requires_grad),
                      lr=0.01, momentum=0.5)

def train():
    ...
```
How can I check whether this is working as intended?
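One way I thought of checking it (a sketch, using a tiny stand-in model instead of the real SegNet, but assuming the same last-layer name `conv11d`): list which parameters still have `requires_grad=True`, then run a backward pass and confirm that only the unfrozen layer receives gradients.

```python
import torch
import torch.nn as nn

# Hypothetical stand-in for SegNet: a tiny model whose last layer
# is named 'conv11d', mirroring the naming assumed above.
class TinyNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.conv1 = nn.Conv2d(3, 8, 3, padding=1)
        self.conv11d = nn.Conv2d(8, 2, 3, padding=1)

    def forward(self, x):
        return self.conv11d(self.conv1(x))

model = TinyNet()

# Freeze everything except the last layer
for name, param in model.named_parameters():
    if not name.startswith('conv11d'):
        param.requires_grad = False

# Check 1: list the parameters that are still trainable.
trainable = [n for n, p in model.named_parameters() if p.requires_grad]
print(trainable)

# Check 2: backward pass — frozen parameters should get no gradient.
model(torch.randn(1, 3, 8, 8)).sum().backward()
print(model.conv1.weight.grad)      # frozen layer: no gradient
print(model.conv11d.weight.grad is not None)  # active layer: has a gradient
```

If the freezing worked, Check 1 should print only the `conv11d` parameter names, and in Check 2 the frozen layer's `.grad` stays `None` while the last layer's is populated.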