Dataset __getitem__ possible issue

Hi,

I have a custom dataset whose __getitem__ method takes one parameter, idx:
def __getitem__(self, idx):

I use PyTorch's DataLoader with batch_size=1, num_workers=0:
train_loader = DataLoader(my_custom_dataset, batch_size=1, num_workers=0, shuffle=True)

The problem is that my_custom_dataset.__getitem__ is called with large idx values. If I set a breakpoint on DataLoader line 264, batch = self.collate_fn([self.dataset[i] for i in indices]), I can see that i has a valid value, but the idx parameter passed to __getitem__ is not equal to i.
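
For reference, with shuffle=True and batch_size=1 the DataLoader draws indices from a RandomSampler over range(len(dataset)), so __getitem__ should receive every index in [0, len(dataset)) exactly once per epoch, just in shuffled order. Here is a minimal sketch I used to convince myself of that (MyDataset is just a hypothetical stand-in, not the real custom dataset):

import torch
from torch.utils.data import Dataset, DataLoader

class MyDataset(Dataset):
    def __init__(self, n=10):
        self.data = torch.arange(n, dtype=torch.float32)

    def __len__(self):
        return len(self.data)

    def __getitem__(self, idx):
        # idx comes straight from the sampler; compare it against i in collate_fn
        print("__getitem__ called with idx =", idx)
        return self.data[idx]

loader = DataLoader(MyDataset(), batch_size=1, num_workers=0, shuffle=True)
for batch in loader:
    pass  # printed idx values should cover 0..len(dataset)-1 in random order

If the printed idx values fall outside that range, the problem is most likely in the dataset itself (e.g. __len__ or some internal index mapping) rather than in the DataLoader.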

PyTorch 0.4
Windows 10 64-bit

Never mind, it was a bug in my code…