Relation between model parameters and batch size

I am trying to find the relation between batch size and the total number of parameters.

  • Can I increase the input batch size at inference time by reducing the model parameters?
    I tried, but could not see any difference :thinking:.

By reducing parameters I mean pruning the model; please consider this topic a discussion.

What exactly are you trying to achieve? The only relation between batch size and the number of parameters is the memory they occupy: for a fixed amount of GPU memory, if your model is larger, the maximum batch size will be lower.
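If you have a GPU handy, you can see both effects directly. Here is a minimal sketch, assuming PyTorch and a CUDA device (the two toy models and their layer widths are made up purely for illustration): the peak memory of a forward pass grows with the batch size, and a larger model leaves less headroom for that growth.

```python
import torch
import torch.nn as nn

def peak_inference_memory_mb(model: nn.Module, batch_size: int, in_features: int) -> float:
    """Run one forward pass and report the peak GPU memory used, in MB."""
    torch.cuda.reset_peak_memory_stats()
    x = torch.randn(batch_size, in_features, device="cuda")
    with torch.no_grad():
        model(x)
    return torch.cuda.max_memory_allocated() / 1024**2

# Two toy models of different size (hypothetical widths, for illustration only).
small = nn.Sequential(nn.Linear(1024, 512), nn.ReLU(), nn.Linear(512, 10)).cuda()
large = nn.Sequential(nn.Linear(1024, 8192), nn.ReLU(), nn.Linear(8192, 10)).cuda()

for name, model in [("small", small), ("large", large)]:
    for bs in (16, 256, 4096):
        print(f"{name} model, batch {bs}: "
              f"{peak_inference_memory_mb(model, bs, 1024):.1f} MB")
```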

I am just trying to figure out the relation between these two.
So for example:
the model has 100k parameters and runs with a batch size of 16;
the optimized model has 80k parameters. Can I feed a batch size greater than 16 here? Is that correct?

Yes, I'd think so. If you have fewer parameters in memory, you can pass a larger batch size.
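That said, for the concrete numbers above it's worth doing the arithmetic. Assuming float32 weights (4 bytes each), going from 100k to 80k parameters frees well under a megabyte, which is negligible on a multi-GB GPU and would explain why no difference in the maximum batch size was visible:

```python
# Back-of-the-envelope arithmetic for the 100k vs. 80k example above,
# assuming float32 (4 bytes per parameter); numbers are illustrative only.
bytes_per_param = 4
mem_100k = 100_000 * bytes_per_param / 1024**2   # ~0.38 MB
mem_80k = 80_000 * bytes_per_param / 1024**2     # ~0.31 MB
freed = mem_100k - mem_80k                       # ~0.08 MB freed by pruning
print(f"{mem_100k:.2f} MB -> {mem_80k:.2f} MB, frees {freed:.2f} MB")
# The freed memory is tiny compared to typical GPU memory, so the maximum
# batch size barely changes; activations, which scale with the batch size,
# usually dominate inference memory for small models like this.
```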
