Error while loading: stack expects each tensor to be equal size

The code I was running is simple:

from tqdm import tqdm
import torch
from torchvision import datasets, transforms

transform = transforms.Compose([
    transforms.ToTensor(),
    transforms.Normalize((0.5, 0.5, 0.5), (0.5, 0.5, 0.5)),
])
data = datasets.StanfordCars('./data', split='train', transform=transform, download=True)
loader = torch.utils.data.DataLoader(data, 128, shuffle=False, drop_last=True, pin_memory=False)

for i, data in enumerate(tqdm(loader), 0):
    pass

The error I got while running this is:

  0%|                                                                                                                                   | 0/63 [00:01<?, ?it/s]
Traceback (most recent call last):
  File "/home/john/research/turbo/test.py", line 7, in <module>
    for i, data in enumerate(tqdm(loader), 0):
  File "/home/john/miniconda3/envs/torch/lib/python3.10/site-packages/tqdm/std.py", line 1195, in __iter__
    for obj in iterable:
  File "/home/john/miniconda3/envs/torch/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 628, in __next__
    data = self._next_data()
  File "/home/john/miniconda3/envs/torch/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 671, in _next_data
    data = self._dataset_fetcher.fetch(index)  # may raise StopIteration
  File "/home/john/miniconda3/envs/torch/lib/python3.10/site-packages/torch/utils/data/_utils/fetch.py", line 61, in fetch
    return self.collate_fn(data)
  File "/home/john/miniconda3/envs/torch/lib/python3.10/site-packages/torch/utils/data/_utils/collate.py", line 265, in default_collate
    return collate(batch, collate_fn_map=default_collate_fn_map)
  File "/home/john/miniconda3/envs/torch/lib/python3.10/site-packages/torch/utils/data/_utils/collate.py", line 143, in collate
    return [collate(samples, collate_fn_map=collate_fn_map) for samples in transposed]  # Backwards compatibility.
  File "/home/john/miniconda3/envs/torch/lib/python3.10/site-packages/torch/utils/data/_utils/collate.py", line 143, in <listcomp>
    return [collate(samples, collate_fn_map=collate_fn_map) for samples in transposed]  # Backwards compatibility.
  File "/home/john/miniconda3/envs/torch/lib/python3.10/site-packages/torch/utils/data/_utils/collate.py", line 120, in collate
    return collate_fn_map[elem_type](batch, collate_fn_map=collate_fn_map)
  File "/home/john/miniconda3/envs/torch/lib/python3.10/site-packages/torch/utils/data/_utils/collate.py", line 163, in collate_tensor_fn
    return torch.stack(batch, 0, out=out)
RuntimeError: stack expects each tensor to be equal size, but got [3, 400, 600] at entry 0 and [3, 675, 900] at entry 1

Could anyone please help me solve this issue? I haven't even used a model yet; just loading the data raises this error.

That happens because, when building a batch internally, the DataLoader's default collate function uses torch.stack to combine the individual samples into one batched tensor. torch.stack requires every tensor to have the same shape (C, H, W), but the Stanford Cars images come in varying resolutions (the traceback shows [3, 400, 600] vs. [3, 675, 900]). So you first need to resize the images by adding a Resize transform:

hw_size = 256  # Set your desired size here
transform = transforms.Compose([
    transforms.Resize((hw_size, hw_size)),
    transforms.ToTensor(),
    transforms.Normalize((0.5, 0.5, 0.5), (0.5, 0.5, 0.5)),
])
data = datasets.StanfordCars('./data', split='train', transform=transform, download=True)
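To see why the collate step fails, here is a minimal sketch (plain torch, no dataset download needed) that reproduces the error with the shapes from the traceback and shows that equal shapes fix it. The `list_collate` helper at the end is an illustrative sketch of an alternative, not a torchvision API:

```python
import torch

# Two "images" with the shapes from the traceback: stacking fails
# because torch.stack requires every tensor to have the same shape.
a = torch.zeros(3, 400, 600)
b = torch.zeros(3, 675, 900)
try:
    torch.stack([a, b], 0)
except RuntimeError as e:
    print("stack failed:", e)

# After resizing both to the same (C, H, W), stacking works and
# yields a (batch, C, H, W) tensor, which is what the DataLoader returns.
c = torch.zeros(3, 256, 256)
d = torch.zeros(3, 256, 256)
batch = torch.stack([c, d], 0)
print(batch.shape)  # torch.Size([2, 3, 256, 256])

# If you need to keep the original resolutions, a custom collate_fn
# that returns a plain list of tensors (hypothetical helper, not part
# of torchvision) also avoids the error:
def list_collate(samples):
    images = [img for img, _ in samples]            # variable sizes are fine
    labels = torch.tensor([label for _, label in samples])
    return images, labels

# loader = torch.utils.data.DataLoader(data, 128, collate_fn=list_collate)
```

The list-based collate trades away batched tensor ops (you would have to process images one at a time or pad them later), so resizing up front is usually the simpler fix.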

Thanks for your reply!