I have shared a minimal working example of a CustomDataLoader in PyTorch here. When I run the test_dataloader script with BATCH_SIZE B, I see the following error while the __getitem__ function is processing the (B+1)th element:
File "/usr/local/lib/python3.7/site-packages/torch/utils/data/dataloader.py", line 106, in _worker_loop
samples = collate_fn([dataset[i] for i in batch_indices])
File "/Users/domarps/Documents/study/pytorch_dataloader/data_loader.py", line 64, in filtered_collate_fn
return torch.utils.data.dataloader.default_collate([x for x in batch if x is not None])
File "/usr/local/lib/python3.7/site-packages/torch/utils/data/dataloader.py", line 187, in default_collate
return [default_collate(samples) for samples in transposed]
File "/usr/local/lib/python3.7/site-packages/torch/utils/data/dataloader.py", line 187, in <listcomp>
return [default_collate(samples) for samples in transposed]
File "/usr/local/lib/python3.7/site-packages/torch/utils/data/dataloader.py", line 173, in default_collate
return torch.stack([torch.from_numpy(b) for b in batch], 0)
File "/usr/local/lib/python3.7/site-packages/torch/utils/data/dataloader.py", line 173, in <listcomp>
return torch.stack([torch.from_numpy(b) for b in batch], 0)
ValueError: some of the strides of a given numpy array are negative. This is currently not supported, but will be added in future releases.
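For context on the error itself: torch.from_numpy refuses NumPy arrays whose strides are negative, which typically happens when __getitem__ returns a reversed or flipped view (e.g. arr[::-1] or np.flip) instead of a contiguous array. I can't see the MWE's transforms, so this is an assumption about the cause, but the sketch below shows how such a view gets negative strides and how a contiguous copy restores them:

```python
import numpy as np

# A reversed slice is a *view* with a negative stride along the
# flipped axis -- the usual trigger for this ValueError (assumed
# cause; the actual transform in the MWE is not shown here).
arr = np.arange(6, dtype=np.float32).reshape(2, 3)
flipped = arr[::-1]                  # view, first stride is negative
print(flipped.strides)               # e.g. (-12, 4)

# torch.from_numpy(flipped) would raise the ValueError above.
# A contiguous copy has only positive strides and converts fine:
fixed = np.ascontiguousarray(flipped)   # or flipped.copy()
print(fixed.strides)                 # e.g. (12, 4)
```

If this matches your case, returning np.ascontiguousarray(...) (or calling .copy() on the flipped view) from __getitem__ should let default_collate stack the batch.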