I have been building a model that uses both SqueezeNet and TenCrop. However, when I attempt to train it, I get the following error:
~/anaconda2/envs/pytorch/lib/python3.6/site-packages/torchvision-0.2.0-py3.6.egg/torchvision/models/squeezenet.py in forward(self, x)
99 x = self.features(x)
100 x = self.classifier(x)
--> 101 return x.view(x.size(0), self.num_classes)
102
103
RuntimeError: invalid argument 2: size '[10 x 14]' is invalid for input with 931840 elements at /Users/soumith/minicondabuild3/conda-bld/pytorch_1512381214802/work/torch/lib/TH/THStorage.c:41
My code is pretty basic and looks something like this:
model = torchvision.models.squeezenet.squeezenet1_1()
model.num_classes = 14
model.classifier = nn.Sequential(nn.Linear(13, 14))

train_dataset = DataLoader(ImageFolder(
    '/path/to/data',
    transform=transforms.Compose([
        transforms.TenCrop(224),
        transforms.Lambda(lambda crops: torch.stack(
            [transforms.ToTensor()(crop) for crop in crops])),
    ])))

for i, (input, target) in enumerate(train_dataset):
    bs, ncrops, c, h, w = input.size()
    result = model(input.view(-1, c, h, w))
When I run this code I get the error above. I'm not sure whether this is a problem on my end or whether TenCrop simply can't be used with SqueezeNet.
SqueezeNet's forward expects self.classifier to produce output that can be reshaped to (batch_size, self.num_classes), and num_classes defaults to 1000.
I am not sure I fully understand what your model.classifier is actually doing, but I am fairly certain that it is causing the problem.
I'm sorry, I forgot to add that I changed my number of classes to 14, which is my desired number. I have modified my post. I will check my classifier, though. It's just a single FC layer, so I didn't think it would be a big deal.
That is not the proper way to set num_classes. This is:

model = torchvision.models.squeezenet.squeezenet1_1(num_classes=14)

This way, the built-in model.classifier will produce output with 14 classes. I see no good reason to replace the built-in classifier.
That being said, let's check the data shape as it flows through SqueezeNet. Your images are cropped to 224x224, and following the squeezenet source code, model.features produces output of shape (batches, 512, 13, 13).
Applying nn.Linear(13, 14) to that tensor acts on the last dimension only, producing output of shape (batches, 512, 13, 14), which .view(batches, 14) cannot reshape.
That is the source of the error you are seeing.
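The shape arithmetic can be checked directly against the numbers in the traceback (with one input image, TenCrop makes the flattened batch size 10):

```python
# model.features output: (10, 512, 13, 13); nn.Linear(13, 14) maps the
# last dimension 13 -> 14, leaving (10, 512, 13, 14) elements in total.
batch, channels, height, linear_out = 10, 512, 13, 14
total = batch * channels * height * linear_out
print(total)  # 931840 -- the element count in the RuntimeError

# .view(10, 14) needs exactly 10 * 14 = 140 elements, hence
# "size '[10 x 14]' is invalid for input with 931840 elements".
needed = batch * 14
print(needed)  # 140
```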