Infinite or NaN batch loss encountered when shuffling the training data

Hello @Yozey!

I know this reply is late, almost a year late, but I ran into a similar problem involving the DataLoader's num_workers parameter. For you or anyone else reading this, please see my post about it elsewhere in this forum, and also my response below.

For me the problem was that some of the images were of inconsistent size. This only surfaced when shuffling the dataset, and even then only by chance, depending on which images happened to land in the same batch.
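In case it helps anyone debugging the same thing, here is a minimal sketch of how you could scan a dataset for inconsistently sized samples before training. The helper name and the `(image, label)` tuple layout are just assumptions for the example; adapt them to your own dataset class:

```python
import torch

def find_inconsistent_samples(dataset):
    """Report indices whose tensor shape differs from the first sample's shape."""
    expected = None
    bad = []
    for i in range(len(dataset)):
        img = dataset[i][0]  # assumes samples are (image, label) tuples
        shape = tuple(img.shape)
        if expected is None:
            expected = shape
        elif shape != expected:
            bad.append((i, shape))
    return expected, bad

# Toy example: the third sample has a different height.
samples = [(torch.zeros(3, 32, 32), 0),
           (torch.zeros(3, 32, 32), 1),
           (torch.zeros(3, 31, 32), 0)]
expected, bad = find_inconsistent_samples(samples)
print(expected)  # (3, 32, 32)
print(bad)       # [(2, (3, 31, 32))]
```

Once the offending indices are known, one common fix is to add a resize (or crop) step to the transform pipeline so every image reaches the model at the same size.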