A newbie issue.
I’m using CrossEntropyLoss with data on the GPU, but I still get this error:
RuntimeError: dimension out of range (expected to be in range of [-1, 0], but got 1).
I think my primitive way of loading data onto the GPU is causing this problem, because when I run the neural network on data from the CPU everything works correctly.
```python
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

xy = np.loadtxt('.csv', delimiter=',', dtype=np.float32)
x_datai = torch.Tensor(xy[:, 0:-1]).to(device)
y_datai = torch.Tensor(xy[:, [-1]]).to(device)
train_loader = TensorDataset(x_datai, y_datai)

# -------------------

for i, data in enumerate(train_loader):
    inputs, target = data
    target = target.squeeze(1)
    optimizer.zero_grad()
    output = model(inputs).to(device)
    loss = criterion(output, target.long())
    loss.backward()
    optimizer.step()
```
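From the docs I suspect the missing piece is wrapping the TensorDataset in a DataLoader, so each iteration yields a batch with a leading batch dimension instead of a single sample. Here is a minimal sketch of what I think the pattern should be — the random array is just a stand-in for my CSV (10 rows, 3 features plus a label column), and `batch_size=4` is an arbitrary choice:

```python
import numpy as np
import torch
from torch.utils.data import TensorDataset, DataLoader

# Synthetic stand-in for the CSV: 10 samples, 3 features + 1 label column.
xy = np.random.rand(10, 4).astype(np.float32)
xy[:, -1] = np.random.randint(0, 2, size=10)  # integer class labels

x = torch.from_numpy(xy[:, 0:-1])   # shape [10, 3]
y = torch.from_numpy(xy[:, [-1]])   # shape [10, 1]

dataset = TensorDataset(x, y)
train_loader = DataLoader(dataset, batch_size=4, shuffle=True)

for inputs, target in train_loader:
    # inputs: [batch, 3], target: [batch, 1]
    target = target.squeeze(1)      # now [batch], 1-D
```

Is this the right way to set it up before moving the batches to the GPU?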
I tried to solve this on my own for a long time, but it’s too much for me.
Can someone explain how the DataLoader should look so it doesn’t cause this issue?
Thanks for the help.