Hello guys, I’m trying to add a dropout layer before the fully connected (FC) layer at the “bottom” of my ResNet. To do that, I first remove the original FC layer from the resnet18 with the following code:
import torch
import torch.nn as nn
from torchvision import models

resnetk = models.resnet18(pretrained=True)
num_ftrs = resnetk.fc.in_features
resnetk = torch.nn.Sequential(*list(resnetk.children())[:-1])
Then I add the dropout and the new FC layer, using the num_ftrs I obtained from the original FC layer of my resnet18:
resnetk.add_module("dropout", nn.Dropout(p=0.5))
resnetk.add_module("fc", nn.Linear(num_ftrs, n_classes))
But I receive the following error: RuntimeError: size mismatch, m1: [8192 x 1], m2: [512 x 2] at /pytorch/aten/src/THC/generic/THCTensorMathBlas.cu:266
I’m also confused about where the softmax comes in after the linear layer, since in Keras we have to specify the activation function as softmax explicitly.
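From what I’ve read, in PyTorch you normally don’t add a softmax layer at all for classification: nn.CrossEntropyLoss (equivalently F.cross_entropy) expects raw logits and applies log-softmax internally, so the last layer stays a plain nn.Linear, and softmax is only applied explicitly at inference time if you want probabilities. A small check of that equivalence (with made-up logits and targets):

```python
import torch
import torch.nn.functional as F

logits = torch.randn(8, 2)          # raw model outputs for 8 samples, 2 classes
target = torch.randint(0, 2, (8,))  # ground-truth class indices

# cross_entropy == log_softmax followed by nll_loss
loss_a = F.cross_entropy(logits, target)
loss_b = F.nll_loss(F.log_softmax(logits, dim=1), target)
print(torch.isclose(loss_a, loss_b))  # expect tensor(True)

# Softmax only when you actually want probabilities, e.g. at inference
probs = F.softmax(logits, dim=1)
print(probs.sum(dim=1))  # each row sums to 1
```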