I am using a `SequentialSampler` wrapped in a `BatchSampler` with a batch size of 4, but I get this error:

    ValueError: batch_size should be a positive integer value, but got batch_size=4
Stack trace:

```
  self.batch_sampler = torch.utils.data.BatchSampler(self.sampler, self.batch_size, drop_last=False)
  File "/usr/local/lib/python3.7/dist-packages/torch/utils/data/sampler.py", line 217, in __init__
    "but got batch_size={}".format(batch_size))
ValueError: batch_size should be a positive integer value, but got batch_size=4
```
My code snippet:

```python
self.sampler = torch.utils.data.SequentialSampler(self.dataset)
self.batch_sampler = torch.utils.data.BatchSampler(self.sampler, self.batch_size, drop_last=False)
```
`self.dataset` is a custom class that inherits from `torch.utils.data.Dataset`.

Am I doing something wrong?
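A detail worth checking: `BatchSampler.__init__` rejects any `batch_size` that is not a plain `int`, and the error message formats the value with `str`, so a string `"4"` is printed as `4` in the message. If `self.batch_size` comes from a config file or `argparse` without `type=int`, it is likely a string. A minimal sketch reproducing this (the tiny `TensorDataset` stands in for the custom dataset, which is an assumption):

```python
import torch
from torch.utils.data import TensorDataset, SequentialSampler, BatchSampler

# Stand-in for the custom Dataset: 16 scalar samples.
dataset = TensorDataset(torch.arange(16).float())
sampler = SequentialSampler(dataset)

# A string "4" (e.g. read from a config or argparse without type=int)
# fails the isinstance(batch_size, int) check, yet the message still
# prints batch_size=4 with no quotes, exactly as in the question.
try:
    BatchSampler(sampler, "4", drop_last=False)
except ValueError as e:
    print(e)  # batch_size should be a positive integer value, but got batch_size=4

# Casting to int before constructing the sampler resolves it.
batch_sampler = BatchSampler(sampler, int("4"), drop_last=False)
print(len(batch_sampler))  # 4 batches of 4 samples each
```

If that is the cause, `self.batch_size = int(self.batch_size)` (or `type=int` on the argparse argument) should fix it; printing `type(self.batch_size)` right before the `BatchSampler` call will confirm.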