RuntimeError: Expected 4-dimensional input for 4-dimensional weight [64, 1, 3, 3], but got 5-dimensional input of size [1, 1, 128, 128, 128] instead

Using this UNet implementation: https://github.com/kilgore92/PyTorch-UNet

With this instantiation of the UNet object:

model = UNet(n_channels=1,
             mode='3D',
             num_classes=1,
             use_pooling=True,
             )

Whenever I try running my training script with the DataLoader, I get this error: RuntimeError: Expected 4-dimensional input for 4-dimensional weight [64, 1, 3, 3], but got 5-dimensional input of size [1, 1, 128, 128, 128] instead
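From what I can tell, the 4-dimensional weight [64, 1, 3, 3] in the error is the kernel of an nn.Conv2d layer (presumably somewhere inside the UNet), and Conv2d expects 4D input (N, C, H, W), while Conv3d expects 5D input (N, C, D, H, W). A quick sketch of the difference (not the repo's code, just plain PyTorch layers):

import torch

conv2d = torch.nn.Conv2d(in_channels=1, out_channels=64, kernel_size=3)  # weight shape [64, 1, 3, 3]
conv3d = torch.nn.Conv3d(in_channels=1, out_channels=64, kernel_size=3)  # weight shape [64, 1, 3, 3, 3]

x = torch.randn(1, 1, 128, 128, 128)  # the 5D input from the error message
conv3d(x)   # works: Conv3d takes (N, C, D, H, W)
conv2d(x)   # raises a RuntimeError like the one above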

Here’s more information:

This is my training script, and it errors out at the outputs = model.forward(images) line.

def train():
    optimizer = torch.optim.Adam(model.parameters(), lr=learning_rate)
    num_steps_train = len(train_loader)
    
    print(num_steps_train)
    
    for epoch in range(epochs):
        print(' - training - ')
        for i, (images, masks) in enumerate(train_loader):
            images = images.to(device)
            masks = masks.to(device)
            outputs = model.forward(images)  # RuntimeError is raised here

The size of images and masks is (1, 1, 128, 128) when I print it out. This is because in my __getitem__ in the Dataset object, I'm using image = torch.reshape(image, shape=(1, 128, 128, 128)) to reshape my image to the designated size. I tried changing that to image = torch.reshape(image, shape=(128, 128, 128)) and that doesn't work either.
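For reference, this is how I understand the per-sample shape relates to the batch shape the model sees (illustrative only, batch_size=1):

sample = torch.zeros(1, 128, 128, 128)   # what __getitem__ should return after the reshape (C, D, H, W)
batch = torch.stack([sample])            # what the DataLoader hands to the model
print(batch.shape)                       # torch.Size([1, 1, 128, 128, 128])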

This is my train_loader:

train_loader = DataLoader(dataset=Dataset(partition['orig'], partition['segment']),
                          batch_size=batch_size, shuffle=True)

partition is just a dictionary that has original and segmentation mask images in it.
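Before training I also run a quick sanity check on the loader shapes by pulling one batch:

images, masks = next(iter(train_loader))
print(images.shape, masks.shape)   # for mode='3D' I'd expect torch.Size([1, 1, 128, 128, 128])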

Hi,

If the size of images and masks is [1, 1, 128, 128], then aren't your images 2D?
Also, how can you do

image = torch.reshape(image, shape=(1, 128, 128, 128))

when you have only 128*128 elements?
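reshape cannot change the number of elements, so a plain 128x128 tensor can never be reshaped into a 128x128x128 volume, for example:

import torch

img = torch.zeros(128, 128)            # 16,384 elements
img.reshape(1, 128, 128, 128)          # RuntimeError: needs 128*128*128 = 2,097,152 elements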

Although the error says the inputs are [1, 1, 128, 128, 128], I still cannot understand how your inputs of size [1, 1, 128, 128] became 3D.

Bests

Fixed this. It was just an issue with my reshape.
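In case it helps anyone else, the shape bookkeeping that has to line up (assuming the raw sample really holds a full 128x128x128 volume; raw_sample is just a placeholder name) is roughly:

volume = raw_sample.reshape(1, 128, 128, 128)   # (C, D, H, W) returned from __getitem__
# The DataLoader with batch_size=1 then yields (N, C, D, H, W) = (1, 1, 128, 128, 128),
# which is the 5D input a Conv3d-based (mode='3D') UNet expects.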