Batch size 44, bad idea?

Hi, as far as I know everyone uses batch sizes that are powers of two: 8, 16, 32, 64, 128, etc.
Is it bad to use a batch size of, for example, 44? My GPU cannot take 64, but I want to save some time, so I don't want to use 32. A batch size of 44 fits on my GPU. Can I use it, or is it bad for training?

Thank you!

No, it is fine. With multi-dimensional arrays the negative effect is greatly diminished, and it mostly concerns the features/channels dimension anyway. Still, it may be marginally better to choose batch sizes divisible by 2, 4, 8, 16, or 32.
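
If it helps, here is a minimal sketch (assuming PyTorch's `DataLoader` and a made-up toy dataset) showing that nothing special is needed for a batch size of 44; the only practical difference is that the last batch may be smaller unless you pass `drop_last=True`.

```python
import torch
from torch.utils.data import DataLoader, TensorDataset

# Hypothetical toy data: 1,000 samples of 3x32x32 "images" with integer labels
images = torch.randn(1000, 3, 32, 32)
labels = torch.randint(0, 10, (1000,))
dataset = TensorDataset(images, labels)

# batch_size is just an integer; 44 is as valid as 32 or 64
loader = DataLoader(dataset, batch_size=44, shuffle=True, drop_last=False)

for x, y in loader:
    print(x.shape)  # torch.Size([44, 3, 32, 32]) for all full batches
    break
```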