Is it ok to use huge batch size?

I am building a market prediction model (a classification task).
I have 2M rows of data with 130 features.
I have made a simple NN model.
Currently, I am using a batch size of 50,000, and it occupies 1 GB of the 16 GB available in Google Colab.
So is it okay to use such a huge batch size? Should I increase it further, or should batch sizes this large be avoided?

It should be fine. Moreover, you should be able to increase your learning rate accordingly. One thing that could help is to perform some warm-up rounds before increasing the batch size. This article can give you some interesting insights.
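To make the advice above concrete, here is a minimal sketch of the two ideas mentioned: scaling the learning rate linearly with batch size, and ramping it up with a linear warm-up. All the specific numbers (`base_lr`, `base_batch`, `warmup_steps`) are hypothetical examples, not values from the original post:

```python
def scaled_lr(base_lr, base_batch, batch_size):
    """Linear scaling rule: grow the learning rate in proportion
    to the batch size, relative to a baseline that was tuned."""
    return base_lr * batch_size / base_batch

def warmup_lr(target_lr, step, warmup_steps):
    """Ramp the learning rate linearly from ~0 up to target_lr
    over the first warmup_steps training steps."""
    if step >= warmup_steps:
        return target_lr
    return target_lr * (step + 1) / warmup_steps

# Hypothetical example: baseline tuned at batch 256 with lr 1e-3,
# now training at batch 50,000.
target = scaled_lr(1e-3, 256, 50_000)   # 1e-3 * 50_000 / 256
schedule = [warmup_lr(target, s, warmup_steps=500) for s in range(1_000)]
```

In practice you would feed `schedule[step]` into your optimizer each step (e.g. via a PyTorch `LambdaLR`), and you would still want to validate the scaled learning rate empirically, since the linear scaling rule is a heuristic rather than a guarantee.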
