How to train a large data set with many negative samples

My data set has 540,000 images: 40,000 positive images and 500,000 negative images. My computer has only one GTX 980 Ti GPU and one i5-8400 CPU. An epoch currently takes nearly 6 hours. Is there any way to speed up training and reduce overfitting?

As with anything else, to speed up training you can try a smaller model and parallel data loading (a DataLoader with multiple workers), making sure neither one is a bottleneck for the other. To reduce overfitting, you can try things like data augmentation, weight decay, dropout, etc.
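A minimal PyTorch sketch of those ideas, assuming a toy tensor dataset stands in for your real images: parallel loading via `num_workers`, dropout in the model, and weight decay on the optimizer.

```python
import torch
from torch import nn
from torch.utils.data import DataLoader, TensorDataset

# Toy stand-in for the real image dataset (shapes are illustrative).
images = torch.randn(256, 3, 32, 32)
labels = torch.randint(0, 2, (256,))
dataset = TensorDataset(images, labels)

# num_workers > 0 loads batches in parallel worker processes so the GPU
# is not starved waiting on the CPU; pin_memory speeds host-to-GPU copies.
loader = DataLoader(dataset, batch_size=64, shuffle=True,
                    num_workers=2, pin_memory=True)

# A deliberately small model, with Dropout as regularisation.
model = nn.Sequential(
    nn.Flatten(),
    nn.Linear(3 * 32 * 32, 128),
    nn.ReLU(),
    nn.Dropout(p=0.5),
    nn.Linear(128, 2),
)

# weight_decay applies L2 regularisation at every update step.
optimizer = torch.optim.SGD(model.parameters(), lr=0.01,
                            momentum=0.9, weight_decay=1e-4)

criterion = nn.CrossEntropyLoss()
for batch, targets in loader:
    optimizer.zero_grad()
    loss = criterion(model(batch), targets)
    loss.backward()
    optimizer.step()
    break  # one step is enough for illustration
```

On a single 980 Ti, profiling one epoch with and without extra workers is the quickest way to see whether data loading or the model itself is the bottleneck.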

Don’t hesitate to reduce the amount of data, at least to begin with.
Don’t use a model with billions of parameters if possible.
Find a machine with a better GPU, or go for an online solution (Amazon, Google, etc.).
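Since most of your 540k images are negatives, "reduce the data" can mean keeping all positives and subsampling the negatives to a fixed ratio. A sketch in plain Python, assuming hypothetical `(path, label)` pairs with label 1 for positives:

```python
import random

def subsample_negatives(samples, ratio=3, seed=0):
    """Keep every positive but only `ratio` negatives per positive.

    `samples` is a list of (path, label) pairs; label 1 marks the
    positives and 0 the negatives (names here are illustrative).
    """
    positives = [s for s in samples if s[1] == 1]
    negatives = [s for s in samples if s[1] == 0]
    rng = random.Random(seed)  # fixed seed for a reproducible subset
    n_keep = min(len(negatives), ratio * len(positives))
    kept = rng.sample(negatives, n_keep)
    return positives + kept

# Toy example: 4 positives + 50 negatives -> 4 positives + 12 negatives.
data = [(f"pos_{i}.jpg", 1) for i in range(4)] + \
       [(f"neg_{i}.jpg", 0) for i in range(50)]
balanced = subsample_negatives(data)
```

With a 3:1 ratio your epoch shrinks from 540k to roughly 160k images, and the class imbalance is much milder; you can always resample a fresh negative subset per epoch to still see all the data over time.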