Hello,

When using torch.argmax(output, dim=1) to see the predicted classes, I get the values 0, 1, 2, whereas the expected ones are 1, 2, 3.

I assume there may be an error in my code. It's a multi-class classification problem, with an input of 10 variables to predict a target (y). The target has 3 classes: 1, 2 and 3.

I would appreciate it if someone could have a look and let me know what I may be doing wrong.

Here’s my code:

```
import torch
import torch.nn as nn

class NeuralNet(nn.Module):
    def __init__(self, input_size, hidden_size, num_classes):
        super().__init__()
        self.layer1 = nn.Linear(input_size, hidden_size)
        self.relu = nn.ReLU()
        self.layer2 = nn.Linear(hidden_size, hidden_size)
        self.layer3 = nn.Linear(hidden_size, num_classes)

    def forward(self, x):
        out = self.layer1(x)
        out = self.relu(out)
        out = self.layer2(out)
        out = self.relu(out)
        out = self.layer3(out)
        return out
```
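For context, my understanding is that the model returns raw logits of shape (batch, num_classes), and torch.argmax picks the index of the largest logit in each row, so it can only ever return 0 to num_classes - 1. A minimal sketch with made-up logits of what I mean:

```
import torch

# made-up logits for a batch of 2 samples and 3 classes
logits = torch.tensor([[0.1, 2.0, -1.0],
                       [1.5, 0.2, 0.3]])
pred = torch.argmax(logits, dim=1)
print(pred)  # tensor([1, 0]) -- the returned values are 0-based indices
```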

The training:

```
vae = NeuralNet(10, 6, 3)
#device = torch.device("cuda:0" if torch.cuda.is_available() else "cpu")
#vae.to(device)
loss_function = nn.CrossEntropyLoss()
optimizer = torch.optim.SGD(vae.parameters(), lr=0.005)

# shift the labels from 1, 2, 3 to 0, 1, 2, since nn.CrossEntropyLoss
# expects class indices in [0, num_classes - 1]
train_y -= 1
test_y -= 1

def train(epoch):
    vae.train()
    train_loss = 0
    for batch_idx, (data, label) in enumerate(train_loader_X):
        #data = data.to(device)
        #label = label.to(device)
        optimizer.zero_grad()
        out = vae(data)
        loss = loss_function(out, label)
        loss.backward()
        train_loss += loss.item()
        optimizer.step()
        if batch_idx % 500 == 0:
            print('Train Epoch: {} [{}/{} ({:.0f}%)]\tLoss: {:.6f}'.format(
                epoch, batch_idx * len(data), len(train_loader_X.dataset),
                100. * batch_idx / len(train_loader_X), loss.item() / len(data)))
    print('====> Epoch: {} Average loss: {:.4f}'.format(epoch, train_loss / len(train_loader_X.dataset)))
```
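From the docs I understand that nn.CrossEntropyLoss takes raw logits and targets given as 0-based class indices, which is why I subtract 1 from the labels before training. A quick standalone check of that assumption:

```
import torch
import torch.nn as nn

loss_fn = nn.CrossEntropyLoss()
logits = torch.randn(4, 3)            # batch of 4 samples, 3 classes (raw logits)
targets = torch.tensor([0, 2, 1, 2])  # class indices must be in [0, 2]
loss = loss_fn(logits, targets)
print(loss.item())  # a positive scalar
```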

And the prediction:

```
output = vae(test)
predicted = torch.argmax(output, dim=1)
predicted
```
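Since I shifted the labels down by one before training (train_y -= 1), I assume I can map the predicted indices back to the original classes simply by adding 1. A sketch of what I mean, using an example tensor in place of my actual predictions:

```
import torch

predicted = torch.tensor([2, 1, 0, 1, 1, 0])  # example argmax output (0-based)
original_classes = predicted + 1              # map 0, 1, 2 back to 1, 2, 3
print(original_classes)  # tensor([3, 2, 1, 2, 2, 1])
```

Is that the right way to recover the original class labels, or is something actually wrong in the model?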

As a result I obtain this:

tensor([2, 1, 0, …, 1, 1, 0])

As mentioned, the expected values in the tensor above would be 1, 2, 3.

I would be grateful if anyone could shed some light on this.