DataLoader does not work with inputs of different sizes

I am not sure if this is a bug or if it is built this way on purpose, but I noticed that if I have data of size BxCxAxWxH, where A and C can differ from sample to sample, the DataLoader throws an error.

Example:


```python
Training = MyDataset(VideosPath)
for i in range(3):
    sample = Training[i]
    print(i, sample['frames'].size())
# 0 torch.Size([1, 3, 10, 10, 10])
# 1 torch.Size([1, 3, 10, 10, 10])
# 2 torch.Size([1, 3, 10, 10, 10])

dataloader = DataLoader(Training, batch_size=2, shuffle=False, num_workers=4)
for i_batch, sample_batched in enumerate(dataloader):
    print(i_batch, sample_batched['frames'].size())
```

works fine.

But if I have:

```python
Training = MyDataset(VideosPath)
for i in range(3):
    sample = Training[i]
    print(i, sample['frames'].size())
# 0 torch.Size([1, 3, 90, 10, 10])
# 1 torch.Size([1, 3, 211, 10, 10])
# 2 torch.Size([1, 3, 370, 10, 10])

dataloader = DataLoader(Training, batch_size=2, shuffle=False, num_workers=4)
for i_batch, sample_batched in enumerate(dataloader):
    print(i_batch, sample_batched['frames'].size())
```

it does not work and throws an error:


```
RuntimeError                              Traceback (most recent call last)
in <module>()
----> 1 for i_batch, sample_batched in enumerate(dataloader):
      2     print(i_batch, sample_batched['frames'].size())

~/anaconda3/lib/python3.6/site-packages/torch/utils/data/dataloader.py in __next__(self)
    284                 self.reorder_dict[idx] = batch
    285                 continue
--> 286             return self._process_next_batch(batch)
    287
    288     next = __next__  # Python 2 compatibility

~/anaconda3/lib/python3.6/site-packages/torch/utils/data/dataloader.py in _process_next_batch(self, batch)
    305         self._put_indices()
    306         if isinstance(batch, ExceptionWrapper):
--> 307             raise batch.exc_type(batch.exc_msg)
    308         return batch
    309

RuntimeError: Traceback (most recent call last):
  File "/home/alireza/anaconda3/lib/python3.6/site-packages/torch/utils/data/dataloader.py", line 57, in _worker_loop
    samples = collate_fn([dataset[i] for i in batch_indices])
  File "/home/alireza/anaconda3/lib/python3.6/site-packages/torch/utils/data/dataloader.py", line 135, in default_collate
    return {key: default_collate([d[key] for d in batch]) for key in batch[0]}
  File "/home/alireza/anaconda3/lib/python3.6/site-packages/torch/utils/data/dataloader.py", line 135, in <dictcomp>
    return {key: default_collate([d[key] for d in batch]) for key in batch[0]}
  File "/home/alireza/anaconda3/lib/python3.6/site-packages/torch/utils/data/dataloader.py", line 115, in default_collate
    return torch.stack(batch, 0, out=out)
RuntimeError: invalid argument 0: Sizes of tensors must match except in dimension 0. Got 90 and 211 in dimension 3 at /opt/conda/conda-bld/pytorch_1524586445097/work/aten/src/TH/generic/THTensorMath.c:3586
```

The default collate (read “batching”) function tries to torch.stack your data into one Bx… tensor. You’re getting this error because your dimensions A and C vary, but tensors cannot be jagged. In other words, you can’t stack them since the sizes are different.

Since it seems like variable-sized tensors are desired, one solution would be implementing a custom collate callable and supplying it as the DataLoader’s optional collate_fn argument. You’ll receive a list of the dataset’s __getitem__ outputs of length batch_size, and then it’s up to you to stitch them together into a batch however you want.
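A minimal sketch of that idea, assuming (as in the snippets above) each sample is a dict with a 'frames' tensor; `my_collate` is a made-up name:

```python
import torch
from torch.utils.data import DataLoader

def my_collate(batch):
    # batch is a list (length batch_size) of the dataset's __getitem__ outputs.
    # Instead of stacking the variable-sized tensors, keep them in a plain list.
    return {'frames': [sample['frames'] for sample in batch]}

# Usage (Training as above):
# dataloader = DataLoader(Training, batch_size=2, collate_fn=my_collate)
```

Downstream code then has to handle a list of tensors per batch instead of one stacked tensor.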

Yeah, but then other issues will pop up; I see what you are saying, though.
Maybe if they just concatenated them instead of stacking (just along the first dimension), this problem would not show up.
Also, to me the most important dimension is the first one (as the error says, Sizes of tensors must match except in dimension 0): it makes sense for dimension 0 to be allowed to differ, and I can accept having all the other dimensions the same.
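To make the concatenation idea concrete: torch.cat along the first dimension does accept tensors whose sizes differ only there, at the cost of losing the per-sample boundaries (a quick sketch with made-up shapes):

```python
import torch

# Two "videos" whose sizes differ only in dimension 0 (the frame count):
a = torch.zeros(90, 3, 10, 10)
b = torch.zeros(211, 3, 10, 10)

# torch.cat along dim 0 accepts this, unlike torch.stack...
joined = torch.cat([a, b], dim=0)   # shape: (301, 3, 10, 10)

# ...but the batch boundary is gone, so the original lengths have to be
# carried alongside the joined tensor to split it back apart later.
lengths = [a.shape[0], b.shape[0]]  # [90, 211]
```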

Since you have two free dimensions, it’s not clear to me how you’d be able to use torch.cat either. Usually you have to do some sort of padding if you need one neat tensor, and then join the uniform tensors along the batch axis (either by torch.cat-ing along a uniform dim-0 axis or by torch.stack-ing to create a new batch axis; it looks like the former is what you need). Usually I would pad the temporal dimension of the videos (I guess that’s A?), but you say that C varies too, so you would have to do something even more clever. I’ve always done padding inside a custom collate function.

Yeah, A is the temporal dimension.
For simplicity, I can fix C.
And then I could pad along A, but I don’t want to do that because I don’t know what the largest A will be. But yeah, there are some ways to deal with it.
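Not knowing the global largest A can be sidestepped by padding only to the largest A within each batch, inside a custom collate function. A sketch under the assumptions above (fixed C, variable A, samples shaped (1, C, A, H, W) as in the earlier snippets; `pad_collate` is a made-up name):

```python
import torch
import torch.nn.functional as F

def pad_collate(batch):
    # Each sample is a dict with 'frames' of shape (1, C, A, H, W),
    # where only A (the temporal dimension, index 2) varies.
    max_a = max(sample['frames'].shape[2] for sample in batch)
    padded, lengths = [], []
    for sample in batch:
        frames = sample['frames']
        # F.pad takes pairs from the last dimension backwards:
        # (W_left, W_right, H_left, H_right, A_left, A_right)
        padded.append(F.pad(frames, (0, 0, 0, 0, 0, max_a - frames.shape[2])))
        lengths.append(frames.shape[2])
    return {'frames': torch.cat(padded, 0),    # (B, C, max_A, H, W)
            'lengths': torch.tensor(lengths)}  # original A per sample
```

Keeping the original lengths lets later code mask out the zero-padded frames.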

Regardless of how to deal with this specific situation, the error itself says Sizes of tensors must match except in dimension 0, meaning I could permute the dimensions to bring A to dimension 0 and keep the rest the same:

```
0 torch.Size([90, 1, 3, 10, 10])
1 torch.Size([211, 1, 3, 10, 10])
```

But even doing this gives me an error.

Oh you’re right!

You should consider opening an issue on GitHub about the error message. If you look at the docs of torch.stack, you’ll see:

All tensors need to be of the same size.

But when you do:

```python
import torch
a = torch.zeros([1, 20])
b = torch.zeros([2, 20])
torch.stack([a, b])
```

You get

```
RuntimeError: invalid argument 0: Sizes of tensors must match except in dimension 0. Got 1 and 2 in dimension 1 at …
```

My hunch is it’s doing an unsqueeze and a cat in ATen, and the cat throws the error. I’m doing some digging to try to confirm.
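That hunch can be checked from Python: stacking same-sized tensors behaves like unsqueezing each one and concatenating, and for mismatched sizes both formulations fail (a quick sketch; the exact error text is version-specific):

```python
import torch

a = torch.zeros(1, 20)
b = torch.zeros(2, 20)
x = torch.zeros(2, 20)

# For same-sized tensors, stack is equivalent to unsqueeze + cat:
assert torch.equal(torch.stack([x, x], 0),
                   torch.cat([x.unsqueeze(0), x.unsqueeze(0)], 0))

# For mismatched sizes both fail, and the cat formulation would explain
# why the message says "except in dimension 0" even though stack
# requires every dimension to match:
for fn in (lambda: torch.stack([a, b], 0),
           lambda: torch.cat([a.unsqueeze(0), b.unsqueeze(0)], 0)):
    try:
        fn()
        raise AssertionError("expected a RuntimeError")
    except RuntimeError:
        pass
```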
