Batch size is an int, but BatchSampler raises ValueError: batch_size should be a positive integer value, but got batch_size=4

I am using a SequentialSampler wrapped in a BatchSampler with a batch size of 4, but I get this error:
ValueError: batch_size should be a positive integer value, but got batch_size=4

Stack trace:

    self.batch_sampler = torch.utils.data.BatchSampler(self.sampler, self.batch_size, drop_last=False)
  File "/usr/local/lib/python3.7/dist-packages/torch/utils/data/sampler.py", line 217, in __init__
    "but got batch_size={}".format(batch_size))
ValueError: batch_size should be a positive integer value, but got batch_size=4

My code snippet:

self.sampler = torch.utils.data.SequentialSampler(self.dataset)
self.batch_sampler = torch.utils.data.BatchSampler(self.sampler, self.batch_size, drop_last=False)

self.dataset is a custom class that inherits from torch.utils.data.Dataset.
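
For context, here's a minimal sketch of what such a dataset could look like (a hypothetical stand-in; the actual class isn't shown here):

import torch
from torch.utils.data import Dataset

class MyDataset(Dataset):  # hypothetical stand-in for the custom dataset
    def __init__(self, data):
        self.data = data

    def __len__(self):
        return len(self.data)

    def __getitem__(self, idx):
        return self.data[idx]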

Am I doing something wrong?

The error message seems quite misleading.
Which PyTorch version are you using, and could you post an executable code snippet that reproduces the issue so that we can debug it?

Hi, sorry for the inconvenience. I was creating an installable module and testing it in an environment. The initial version was not converting a string to an int; after doing that explicitly, it works. (The installed version was not being updated because I forgot to use --force-reinstall, which is why I thought the fix wasn't working.) The error is still a bit misleading, as it doesn't specify the type of the passed value, but it works now.
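
For reference, here is a minimal sketch that reproduces the failure mode and the fix; the TensorDataset is just an assumed stand-in for my custom dataset:

import torch
from torch.utils.data import BatchSampler, SequentialSampler, TensorDataset

dataset = TensorDataset(torch.arange(10))
sampler = SequentialSampler(dataset)

batch_size = "4"  # e.g. read from a config file and never cast to int

# BatchSampler rejects anything that is not an int, but the error message
# formats the value with str(), so the string "4" prints as a plain 4:
try:
    BatchSampler(sampler, batch_size, drop_last=False)
except ValueError as e:
    print(e)
# > batch_size should be a positive integer value, but got batch_size=4

# Explicitly casting to int fixes it:
batch_sampler = BatchSampler(sampler, int(batch_size), drop_last=False)
print(list(batch_sampler))
# > [[0, 1, 2, 3], [4, 5, 6, 7], [8, 9]]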

Good to hear you’ve isolated the issue.
Yeah, I believe that's the "curse" of the flexibility of Python's format specifier:

print('{}'.format(4))
> 4
print('{}'.format("4"))
> 4
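
Using the !r conversion instead would make the type visible, since repr() keeps the quotes around strings; a small sketch:

print('{!r}'.format(4))
> 4
print('{!r}'.format("4"))
> '4'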