Expected 4-dimensional input for 4-dimensional weight [64, 3, 3, 3], but got 3-dimensional input of size [3, 1936, 2592] instead

I am trying to use CNNs, but I can only get them working on MNIST. As soon as I use a DataLoader it all goes wrong. I've tried multiple Kaggle competitions, and every time I end up in a cycle of DataLoader errors: size mismatches, datatype conflicts, and so on.

I am aware this problem has come up a lot before and I have read those posts but cannot carry over the solution to my own situation.

Notebook: https://www.kaggle.com/blueturtle/siim-cnn-intro

My DataLoader/collate function is doing something weird too. When unpacking train_loader while training the model, "images" is outputting the target value and "labels" is outputting a tuple of (target, tensor of image pixels, target), which has led to some weird indexing a few lines later. I assume they got swapped at some point, but the __getitem__ function returns images, labels, so I am unsure how this has happened.

        for images, labels in enumerate(train_loader):
            steps += 1
            output = model.forward(labels[0][0])
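(Editor's note: the swap described above is exactly what `enumerate` produces. `enumerate(train_loader)` yields `(batch_index, batch)` pairs, so the integer index lands in `images` and the whole batch lands in `labels`. A minimal sketch with a toy dataset and PyTorch's default collate function — the notebook's custom collate will differ in the exact batch structure:)

```python
import torch
from torch.utils.data import DataLoader, TensorDataset

# Toy stand-in dataset: 4 "images" of shape (3, 8, 8) with integer labels.
data = TensorDataset(torch.randn(4, 3, 8, 8), torch.tensor([0, 1, 0, 1]))
loader = DataLoader(data, batch_size=2)

# enumerate() yields (batch_index, batch), so unpacking it as
# `for images, labels in enumerate(loader)` binds the batch index
# to `images` and the whole (images, labels) pair to `labels`.
for images, labels in enumerate(loader):
    print(type(images))   # int: the batch index, not image tensors
    print(type(labels))   # list: [image_batch, label_batch]
    break

# Iterating the loader directly unpacks each batch as intended:
for images, labels in loader:
    print(images.shape)   # torch.Size([2, 3, 8, 8])
    print(labels.shape)   # torch.Size([2])
    break
```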

I have also tried iter(train_loader) and that makes no difference.

Many thanks

This problem was solved with:

            images = torch.FloatTensor(images[0])
            images = images.unsqueeze(1)
            images = images.permute(3, 0, 1, 2)

I am still having problems with this notebook for other reasons, but this part has been solved.
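(Editor's note: for anyone else hitting the error message in the title: `nn.Conv2d` expects a 4-D input of shape (N, C, H, W), while the input here was a single 3-D image of shape (3, 1936, 2592) with no batch dimension. The usual minimal fix is `unsqueeze(0)`; a small sketch with a reduced image size:)

```python
import torch
import torch.nn as nn

# Matches the weight shape [64, 3, 3, 3] from the error message.
conv = nn.Conv2d(in_channels=3, out_channels=64, kernel_size=3)

# A single image shaped (C, H, W) — a small stand-in for the
# (3, 1936, 2592) input from the error message.
image = torch.randn(3, 32, 32)

# Passing `image` directly raises "Expected 4-dimensional input";
# unsqueeze(0) adds the batch dimension Conv2d expects: (N, C, H, W).
batch = image.unsqueeze(0)   # shape (1, 3, 32, 32)
out = conv(batch)
print(out.shape)             # torch.Size([1, 64, 30, 30])
```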