Stack expects each tensor to be equal size, but got [2, 18432] at entry 0 and [1, 0] at entry 4

I am trying to train a neural network using this data loader:

data_loader = torch.utils.data.DataLoader(dataComplete, batch_size=16, shuffle=True)

but it raises the error above in the inner for loop:

for epoch in range(num_epochs):
  for i, (video, _, label) in enumerate(tqdm(data_loader)):
    # training...

The dataset is UCF101, loaded as follows:

dataComplete = torchvision.datasets.UCF101('ucf_dir', 'ucf_ann',
                                           frames_per_clip=n_frames, step_between_clips=1,
                                           frame_rate=None, fold=1, train=True,
                                           transform=transformF,
                                           _precomputed_metadata=None, num_workers=4,
                                           _video_width=0, _video_height=0,
                                           _video_min_dimension=0, _audio_samples=0)

where transformF is (F here is torch.nn.functional):

transformF = transforms.Compose([
    lambda img: F.interpolate(img.permute(0, 3, 2, 1).float(), size=(H, W)),
])

But if I use batch_size=1 when creating the data_loader, it works.

Check the inputs you are passing to torch.stack; they must all have the same size, and the error message says they don't. The DataLoader's default collate function calls torch.stack on the samples of each batch, which is also why batch_size=1 works: a single tensor has nothing to mismatch against.

Use this example as a reference:

import torch

a = torch.randn(4, 3)
b = torch.randn(4, 3)

c = torch.stack([a, b])
c.shape
# torch.Size([2, 4, 3])
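
In your case, the mismatched shapes ([2, 18432] vs. [1, 0]) look like the audio tensors that UCF101 returns alongside each clip: their length varies per video, and some videos have no audio track at all. Since your training loop already discards the audio ((video, _, label)), one workaround is a custom collate_fn that drops it before stacking. This is a minimal sketch, assuming the audio component is indeed the culprit; the name drop_audio_collate is mine, not part of any API:

from torch.utils.data.dataloader import default_collate

def drop_audio_collate(batch):
    # Each UCF101 sample is a (video, audio, label) tuple. The audio
    # tensors differ in length (or are empty), which breaks torch.stack,
    # so discard them and let the default collation stack the rest.
    return default_collate([(video, label) for video, _, label in batch])

data_loader = torch.utils.data.DataLoader(dataComplete, batch_size=16,
                                          shuffle=True, collate_fn=drop_audio_collate)

The loop would then unpack two values per batch, for video, label in tqdm(data_loader). Note this only helps if the video tensors themselves all share the same shape after transformF.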