How many times will __getitem__ in the data.Dataset be called?

In a training loop, for example: `for batch_idx, (inputs, targets, info) in enumerate(self.train_loader):`

Is `__getitem__` called once for each iteration?

I am trying to keep track of how many times `__getitem__` is called. In my case the total is 8960, the number of epochs is 140, the batch size is 16, and `num_workers` is 8. I use 8 GPUs for DDP training.
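One way to count the calls (a minimal sketch with a toy dataset; `num_workers=0` is used deliberately so the counter stays in the main process, since each worker process receives its own copy of the dataset):

```python
import torch
from torch.utils.data import Dataset, DataLoader

class CountingDataset(Dataset):
    """Toy dataset that records how often __getitem__ is called."""
    def __init__(self, n):
        self.n = n
        self.calls = 0

    def __len__(self):
        return self.n

    def __getitem__(self, idx):
        self.calls += 1
        return torch.tensor(idx)

ds = CountingDataset(512)
# num_workers=0 keeps the counter in the main process; with workers > 0
# each worker holds its own copy of the dataset (and its own counter).
loader = DataLoader(ds, batch_size=16, num_workers=0)

for batch in loader:
    pass

print(ds.calls)  # one call per sample: 512 after one epoch
```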

8960 / 140 is 64, but I can't see how 64 relates to these parameters. :frowning:

Does anyone know where the 64 comes from? Maybe it has something to do with the batch size, but that would still leave a 4x difference (64 / 16 = 4).

`__getitem__` will be called `len(sampler)` times per epoch. In the common use case that corresponds to the number of samples, i.e. `len(dataset)`, so that every sample is indexed, processed, and returned once. With DDP and a `DistributedSampler`, however, each rank's sampler yields only roughly `len(dataset) / num_replicas` indices, so each process calls `__getitem__` about `len(dataset) / world_size` times per epoch. The batch size and `num_workers` don't change the total number of calls; they only determine how the calls are batched and which process executes them.
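For the numbers in the question, the sharding arithmetic can be checked directly. A sketch assuming a hypothetical dataset of 512 samples (chosen because 64 calls/epoch/rank × 8 ranks = 512; the actual dataset size is not stated in the thread):

```python
import torch
from torch.utils.data import TensorDataset
from torch.utils.data.distributed import DistributedSampler

# Hypothetical size: 64 calls per epoch per rank * 8 DDP ranks = 512 samples.
dataset = TensorDataset(torch.arange(512))

# Passing num_replicas/rank explicitly lets this run without
# torch.distributed.init_process_group.
sampler = DistributedSampler(dataset, num_replicas=8, rank=0, shuffle=False)

print(len(sampler))        # 64: indices this rank draws per epoch
print(len(sampler) * 140)  # 8960: calls counted on one rank over 140 epochs
```

If the 8960 was counted inside a single process, it matches 64 calls per epoch on that rank × 140 epochs exactly; the factor of 8 (the world size), not the batch size, explains the gap.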