I am trying to train a neural network with this data loader:

    data_loader = torch.utils.data.DataLoader(dataComplete, batch_size=16, shuffle=True)

but it raises an error at the second for loop:
    for epoch in range(num_epochs):
        for i, (video, _, label) in enumerate(tqdm(data_loader)):
            # training...
The dataset is UCF101, loaded like this:
    dataComplete = torchvision.datasets.UCF101(
        'ucf_dir', 'ucf_ann',
        frames_per_clip=n_frames, step_between_clips=1, frame_rate=None,
        fold=1, train=True,
        transform=transformF,
        _precomputed_metadata=None, num_workers=4, _video_width=0,
        _video_height=0, _video_min_dimension=0, _audio_samples=0)

where

    transformF = transforms.Compose([
        lambda img: F.interpolate(img.permute(0, 3, 2, 1).float(), size=(H, W)),
    ])
However, if I set batch_size=1 when creating the data_loader, it works fine.
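Since the exact error message isn't shown, this is an assumption, but UCF101 items are (video, audio, label) tuples and the audio tensors can differ in length from clip to clip, so the default collate function fails to stack them once batch_size > 1. A sketch of a workaround using a custom collate_fn (a real DataLoader parameter) that drops the audio before batching:

```python
import torch
from torch.utils.data.dataloader import default_collate

def custom_collate(batch):
    # Each UCF101 item is (video, audio, label); the audio tensors have
    # varying lengths, which default_collate cannot stack into one tensor.
    # Keep only (video, label) and collate those.
    filtered = [(video, label) for video, audio, label in batch]
    return default_collate(filtered)

# data_loader = torch.utils.data.DataLoader(
#     dataComplete, batch_size=16, shuffle=True, collate_fn=custom_collate)
```

With this, the loop would unpack two items instead of three, e.g. `for video, label in tqdm(data_loader):`.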