Hello,
I’m a beginner with PyTorch and there is something in the examples given in the tutorials that I can’t understand.
In the card classifier example, there is this model:
import torch.nn as nn
import timm

class SimpleCardClassifier(nn.Module):
    def __init__(self, num_classes=53):
        super().__init__()
        self.base_model = timm.create_model('efficientnet_b0', pretrained=True)
        # Keep everything except the original classification head
        self.features = nn.Sequential(*list(self.base_model.children())[:-1])
        enet_out_size = 1280  # default feature size of efficientnet_b0
        # Make a classifier
        self.classifier = nn.Linear(enet_out_size, num_classes)

    def forward(self, x):
        x = self.features(x)
        output = self.classifier(x)
        return output
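If I understand correctly, keeping children()[:-1] preserves timm's global pooling layer, so self.features already returns a flat feature vector. A quick shape check (my assumption: a standard 3x224x224 input) would look like this:

import torch

dummy = torch.randn(2, 3, 224, 224)
feats = SimpleCardClassifier().features(dummy)
print(feats.shape)  # I expect torch.Size([2, 1280]), which is why Linear(1280, 53) fits on top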
If I take one batch from the data loader and apply a cross-entropy criterion like this:
model = SimpleCardClassifier()
criterion = nn.CrossEntropyLoss()

for images, labels in dataloader:
    break

outputs = model(images)
print(outputs.shape)
print(labels.shape)
criterion(outputs, labels)
everything is fine:
torch.Size([64, 53])
torch.Size([64])
tensor(4.0855, grad_fn=<NllLossBackward0>)
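So in this first case, if I understand it right, the criterion receives [batch, num_classes] logits and a 1-D tensor of integer (long) class indices, which is the standard usage. A standalone toy example with random tensors behaves the same way:

import torch
import torch.nn as nn

crit = nn.CrossEntropyLoss()
logits = torch.randn(64, 53)            # [batch, num_classes] raw scores
targets = torch.randint(0, 53, (64,))   # [batch] class indices, dtype torch.int64 (long)
print(crit(logits, targets))            # returns a scalar loss, no error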
But things behave differently in my second example, where the data comes from this Dataset:
import torch
from torch.utils.data import Dataset

class CancerDataset(Dataset):
    def __init__(self, x_data, transform=None):
        self.data = x_data  # pandas DataFrame: features in the first columns, label in the last one
        self.transform = transform
        self.nbcol = x_data.shape[1]

    def __len__(self):
        return len(self.data)

    def __getitem__(self, index):
        xcurrent = torch.tensor(self.data.iloc[index, 0:(self.nbcol - 1)].values)
        xcurrent = xcurrent.to(torch.float32)
        ycurrent = torch.tensor(self.data.iloc[index, self.nbcol - 1])
        if self.transform:
            xcurrent = self.transform(xcurrent)
        return xcurrent, ycurrent
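I haven’t copied the CancerModel itself because I don’t think it matters for the question; judging by the output shape below, it simply ends in a linear layer with a single output, so roughly something like this (hypothetical sketch, the layer sizes are placeholders):

import torch.nn as nn

class CancerModel(nn.Module):
    def __init__(self, in_features=30):  # placeholder input size
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_features, 16),
            nn.ReLU(),
            nn.Linear(16, 1),  # single output -> [batch, 1] logits
        )

    def forward(self, x):
        return self.net(x)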
If I apply the same kind of code:
model = CancerModel()
criterion = torch.nn.CrossEntropyLoss()

for features, labels in train_loader:
    break

outputs = model(features)
print(outputs.shape)
print(labels.shape)
criterion(outputs, labels.float())
the shapes follow the same pattern (except that the model now has a single output):
torch.Size([64, 1])
torch.Size([64])
But I get an error from the criterion, “expected scalar type Long but found Float”, which is not raised if I change the call to the loss like this:
criterion(outputs, labels.view(-1,1).float())
which changes the shape of the labels to torch.Size([64, 1]).
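The same behaviour can be reproduced with plain random tensors (shapes chosen to match the ones above), so it doesn’t seem to depend on my data:

import torch
import torch.nn as nn

crit = nn.CrossEntropyLoss()
outputs = torch.randn(64, 1)   # same shape as my model output
labels = torch.rand(64)        # float labels, shape [64]

# crit(outputs, labels)                    # -> RuntimeError: expected scalar type Long but found Float
print(crit(outputs, labels.view(-1, 1)))   # no error once the shapes match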
I don’t understand at all what’s going on here: why does the first example accept the labels as they are, while the second one needs them reshaped to [64, 1]?