Morning all,

I am reading in a 2048x420000 NumPy array: 420000 samples, each 2048 long.

I convert this NumPy array into a tensor using torch.Tensor, and then set up a single 1x2048 tensor for the labels.

I use TensorDataset(X_train, y_train) to combine them (as far as I understand it), and this is then passed into a DataLoader with a batch size of 32.
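Roughly what I'm doing, as a minimal sketch (I've used a zero-filled stand-in array scaled down to 2048x4200 instead of my real 2048x420000 data so it runs quickly; the variable names are just placeholders):

```python
import numpy as np
import torch
from torch.utils.data import TensorDataset, DataLoader

# Stand-in for my real data: rows of length 4200 (really 420000),
# with 2048 rows. My actual samples are the 2048-long columns.
data = np.zeros((2048, 4200), dtype=np.float32)

X_train = torch.Tensor(data)   # shape: [2048, 4200]
y_train = torch.zeros(2048)    # one label per row

dataset = TensorDataset(X_train, y_train)  # indexes along dim 0 (the rows)
loader = DataLoader(dataset, batch_size=32)

xb, yb = next(iter(loader))
print(xb.shape)  # torch.Size([32, 4200]) -- batches of rows, not columns
print(yb.shape)  # torch.Size([32])
```

So each batch comes out as 32 rows of the array, sized by the number of columns, which is where my shape confusion below comes from.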

This then gives me the error:

RuntimeError: Expected 3-dimensional input for 3-dimensional weight 16 1, but got 2-dimensional input of size [32, 42000] instead

I'm confused: why isn't the tensor generated by the DataLoader [32, 2048]? It seems to be using the size of the row (42000) and not the columns (2048).

chaslie