In a training loop, for example `for batch_idx, (inputs, targets, info) in enumerate(self.train_loader):`, my understanding is that `__getitem__` is called once per iteration of this loop.
I am trying to keep track of the number of times `__getitem__` is called. In my run the total comes to 8960, with 140 epochs, a batch_size of 16, and num_workers = 8; I train with DDP on 8 GPUs.
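For reference, the counting looks roughly like this (a minimal sketch with toy data, not my actual dataset; the 512-sample size is purely illustrative):

```python
import torch
from torch.utils.data import Dataset, DataLoader

class CountingDataset(Dataset):
    """Toy dataset that counts __getitem__ calls (illustrative only)."""

    def __init__(self, num_samples=512):
        self.num_samples = num_samples
        self.calls = 0

    def __len__(self):
        return self.num_samples

    def __getitem__(self, idx):
        # Note: with num_workers > 0 each worker process gets its own copy
        # of the dataset, so this counter is per worker, not global.
        self.calls += 1
        return torch.randn(3), torch.tensor(0), idx

if __name__ == "__main__":
    ds = CountingDataset()
    # num_workers=0 so the counter stays in this process
    loader = DataLoader(ds, batch_size=16, num_workers=0)
    for batch_idx, (inputs, targets, info) in enumerate(loader):
        pass
    print(ds.calls)  # 512 for this toy setup: one call per sample index fetched
```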
8960 / 140 = 64 calls per epoch, but I can't work out how that 64 connects to the other parameters.
Does anyone know where the 64 comes from? Maybe it has something to do with batch_size, but that still leaves a 4x difference (64 / 16 = 4).
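To make the mismatch concrete, this is the arithmetic as I see it (just the numbers from above, nothing else assumed):

```python
total_calls = 8960  # __getitem__ calls counted over the whole run
epochs = 140
batch_size = 16
num_workers = 8
world_size = 8      # DDP processes, one per GPU

calls_per_epoch = total_calls / epochs   # 64.0
print(calls_per_epoch)                   # 64.0
print(calls_per_epoch / batch_size)      # 4.0 -- the unexplained 4x factor
```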