People always talk about the ‘number of epochs’ when training a neural network. However, I used to have 1,300 sample images when training a GAN, which gave nice results at epoch 200, but now I have collected 130,000 samples. I am almost sure I will not need 200 epochs. I think a neural network’s learning progress is linked more to the number of iterations than to the number of epochs it was trained for. Do you guys have any experience with this issue?
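To make the iterations-vs-epochs intuition concrete, here is a rough back-of-the-envelope sketch (the helper function and the batch size of 32 are my own assumptions, not anything from a framework) that converts an epoch budget on a small dataset into the roughly equivalent epoch budget on a larger one, by holding the total number of gradient updates constant:

```python
import math

def equivalent_epochs(old_samples, old_epochs, new_samples, batch_size=32):
    """Scale the epoch count so the total number of gradient updates
    (iterations) stays roughly constant when the dataset grows.
    Hypothetical helper for illustration only."""
    # Total updates performed on the old dataset
    old_iters = old_epochs * math.ceil(old_samples / batch_size)
    # Updates per epoch on the new dataset
    new_iters_per_epoch = math.ceil(new_samples / batch_size)
    return round(old_iters / new_iters_per_epoch)

# 200 epochs on 1,300 samples vs. the new 130,000-sample dataset:
print(equivalent_epochs(1_300, 200, 130_000))  # prints 2
```

By this measure, the old 200-epoch run corresponds to only about 2 epochs on the 100x larger dataset, which is why matching iteration counts rather than epoch counts seems like the more meaningful comparison.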
To clarify, I have been training a PG-GAN: training for 10 epochs, then growing for the next 10 epochs, and repeating this for 8 growth steps, for a total of 160 epochs, plus 40 epochs after full growth. I have now left my machine training on this new 130,000-sample dataset with the epochs per growth phase set to 1 instead of 10. I’ll see the results tomorrow.