How to fix "Dimension out of range (expected to be in range of [-1, 0], but got 1)" error

Hi, I'm getting an error during model training that I'm not sure how to interpret. data_tensor contains 4 numbers such as [a, b, c, d], and result is a tensor containing either -1 or 1 (the value I'm trying to predict). My code is below. Thank you for your help.

import torch
import torch.nn as nn

class LogReg(nn.Module):
    def __init__(self):
        super(LogReg, self).__init__()
        self.lin1 = nn.Linear(4, 2)
        self.sig1 = nn.Sigmoid()
    
    def forward(self, x):
        x = self.lin1(x)
        y = self.sig1(x)
        return y

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
model = LogReg()
model = model.to(device)

criterion = nn.CrossEntropyLoss()

learning_rate = 0.01
optimizer = torch.optim.SGD(model.parameters(), lr=learning_rate, momentum=0.9)

def train(num_epoch):
    model.train()
    
    for epoch in range(num_epoch):
        for i in range(len(dataset)):
            
            data_tensor, result = dataset[i]
            data_tensor, result = data_tensor.to(device), result.to(device)
            
            optimizer.zero_grad()
            
            output = model(data_tensor)
            loss = criterion(output, result)
            loss.backward()
            optimizer.step()
            
    
    print("DONE TRAINING for", epoch, "out of", num_epoch)   

I assume the task is binary classification; in that case I would use nn.BCELoss, change self.lin1 to nn.Linear(4, 1), and encode the classes as 0 or 1 rather than -1 or 1. nn.CrossEntropyLoss expects a batched [N, C] input of raw logits (no Sigmoid) and class indices in [0, C-1] as targets, which is most likely why you are seeing that dimension error.
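Something like this minimal sketch is what I mean (the random batch below is just a stand-in for your dataset, and the batch size of 8 is an arbitrary assumption):

import torch
import torch.nn as nn

class LogReg(nn.Module):
    def __init__(self):
        super(LogReg, self).__init__()
        self.lin1 = nn.Linear(4, 1)   # single output unit for binary classification
        self.sig1 = nn.Sigmoid()      # squashes the logit into a probability in (0, 1)

    def forward(self, x):
        return self.sig1(self.lin1(x))

model = LogReg()
criterion = nn.BCELoss()              # expects probabilities and float targets of 0.0 / 1.0
optimizer = torch.optim.SGD(model.parameters(), lr=0.01, momentum=0.9)

# Dummy batch standing in for your dataset: 8 samples with 4 features each,
# with the -1/1 labels remapped to 0.0/1.0.
data_tensor = torch.randn(8, 4)
result = torch.randint(0, 2, (8, 1)).float()

optimizer.zero_grad()
output = model(data_tensor)           # shape [8, 1], same shape as the target
loss = criterion(output, result)
loss.backward()
optimizer.step()

Note that nn.BCELoss wants the targets as floats with the same shape as the model output, so remap -1 to 0 before computing the loss. If you prefer, nn.BCEWithLogitsLoss lets you drop the Sigmoid layer and is numerically more stable.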

Alright, thank you, let me try that out.