Hi PyTorchers,

My DataLoader takes a long time on every nth iteration and only a fraction of a second on all the others. What could be the reason? I am using 8 workers for data loading with a batch_size of 128. Here is the log (numbers in parentheses are running averages):

```
Iter: 0 Epoch: [0/40][0/706] Time: 25.277 (25.277) Loss: 2.1912 (2.1912) lr: 0.0300
Iter: 1 Epoch: [0/40][1/706] Time: 0.732 (13.005) Loss: 2.1298 (2.1605) lr: 0.0300
Iter: 2 Epoch: [0/40][2/706] Time: 0.040 (8.683) Loss: 1.9594 (2.0935) lr: 0.0300
Iter: 3 Epoch: [0/40][3/706] Time: 0.012 (6.515) Loss: 1.8457 (2.0315) lr: 0.0300
Iter: 4 Epoch: [0/40][4/706] Time: 0.001 (5.212) Loss: 1.8034 (1.9859) lr: 0.0300
Iter: 5 Epoch: [0/40][5/706] Time: 0.002 (4.344) Loss: 1.9047 (1.9724) lr: 0.0300
Iter: 6 Epoch: [0/40][6/706] Time: 0.001 (3.724) Loss: 1.9777 (1.9731) lr: 0.0300
Iter: 7 Epoch: [0/40][7/706] Time: 0.001 (3.258) Loss: 1.9933 (1.9757) lr: 0.0300
Iter: 8 Epoch: [0/40][8/706] Time: 25.211 (5.697) Loss: 1.9172 (1.9692) lr: 0.0300
Iter: 9 Epoch: [0/40][9/706] Time: 0.677 (5.195) Loss: 1.8515 (1.9574) lr: 0.0300
Iter: 10 Epoch: [0/40][10/706] Time: 0.001 (4.723) Loss: 1.7753 (1.9408) lr: 0.0300
Iter: 11 Epoch: [0/40][11/706] Time: 0.291 (4.354) Loss: 1.7470 (1.9247) lr: 0.0300
Iter: 12 Epoch: [0/40][12/706] Time: 0.001 (4.019) Loss: 1.7333 (1.9100) lr: 0.0300
Iter: 13 Epoch: [0/40][13/706] Time: 0.487 (3.767) Loss: 1.7477 (1.8984) lr: 0.0300
Iter: 14 Epoch: [0/40][14/706] Time: 0.001 (3.516) Loss: 1.7632 (1.8894) lr: 0.0300
Iter: 15 Epoch: [0/40][15/706] Time: 0.104 (3.302) Loss: 1.7588 (1.8812) lr: 0.0300
```
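For context, the `Time` column above is measured roughly as in the sketch below. This is a minimal, self-contained stand-in: the real code iterates a `torch.utils.data.DataLoader` with `num_workers=8` and `batch_size=128`, but here a plain list plays the role of the loader so only the timing logic is shown.

```python
import time

# Stand-in for my actual DataLoader (num_workers=8, batch_size=128);
# a plain list lets the timing logic run on its own.
loader = [object()] * 16

end = time.time()
for i, batch in enumerate(loader):
    # Time spent waiting for the batch to arrive from the loader --
    # this is the number logged in the "Time" column.
    data_time = time.time() - end
    # ... forward pass, loss, backward pass would go here ...
    end = time.time()
```

The stalls in my log hit iteration 0 and iteration 8, i.e. exactly `num_workers` iterations apart, which is why I suspect the data loading rather than the model step.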

Thank you in advance for your help.