How to load autoencoder latent representations into a DataLoader

I created a function that takes a list as input and converts it into a dataset that can be passed to a DataLoader.

import torch
from torch.utils.data import TensorDataset, DataLoader

def load_data(list_):
    if len(list_[0]) == 2:
        # Unzip the (img, label) pairs into two sequences
        img, lab = zip(*list_)
        xs = torch.stack(img).squeeze(1)
        ys = torch.tensor(lab)  # dtype is inferred from the labels
        dataset = TensorDataset(xs, ys)
    elif len(list_[0]) == 3:
        # Unzip the (img, label, flag) triples into three sequences
        img, lab, flag = zip(*list_)
        xs = torch.stack(img).squeeze(1)
        ys = torch.tensor(lab)
        flags = torch.tensor(flag)
        dataset = TensorDataset(xs, ys, flags)
    else:
        raise ValueError(f"Expected entries of length 2 or 3, got {len(list_[0])}")
    return dataset

dataset = load_data(list_)  # Converting the list into a dataset
n_train = int(0.90 * len(dataset))
train_data, val_data = torch.utils.data.random_split(dataset, [n_train, len(dataset) - n_train])
train_loader = DataLoader(train_data, batch_size=BATCH_SIZE, shuffle=True)
val_loader = DataLoader(val_data, batch_size=BATCH_SIZE, shuffle=True)

The error I am getting complains about mismatched tensor sizes, which I don't fully understand:

Traceback (most recent call last):
  File "newpipeline.py", line 242, in <module>
    dataset = load_data(list_) # Coverting the list into dataset
  File "newpipeline.py", line 26, in load_data
    xs = torch.stack(img)
RuntimeError: invalid argument 0: Sizes of tensors must match except in dimension 0. Got 64 and 28 in dimension 1 at ../aten/src/TH/generic/THTensor.cpp:689

The data I dump into the list from the latent dimension of the autoencoder has shape [Batch-size, 256, 2, 2].
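Roughly, the list is built like this (the `encoder` and `loader` names below are placeholders for my actual objects):

# Rough sketch of how list_ is populated; `encoder` and `loader` stand in
# for the actual autoencoder and DataLoader used in the pipeline.
list_ = []
for imgs, labels in loader:
    z = encoder(imgs)                   # z has shape [batch_size, 256, 2, 2]
    list_.append((z.detach(), labels))  # each entry holds a whole batch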

Hi,
The error is raised inside the torch.stack(img) call, which means the tensors you are stacking do not all have the same shape. Based on the error message, some of them have size 64 in dimension 1 while others have size 28.
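You can verify this by printing the distinct shapes right before the stack:

# Quick sanity check: torch.stack needs every tensor to have the same shape,
# so this set should contain exactly one entry.
shapes = {tuple(t.shape) for t in img}
print(shapes)

If each list entry holds a whole batch of latents (as the [Batch-size, 256, 2, 2] shape suggests), the last batch of an epoch is usually smaller than the rest, which would explain the 64 vs. 28 mismatch. In that case you could either split each batch into per-sample tensors before building the list, or concatenate along the batch dimension with torch.cat(img, dim=0) instead of torch.stack(img).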

Bests
