nn.DataParallel with batch size 1


(Joey Wrong) #1

Hi

May I ask what happens if the batch size is 1 and nn.DataParallel is used? Will the data still get split across the GPUs, or will nothing happen?

Best Regards


(ChengLu She) #2

Just tested it: if the batch_size is 1 and DataParallel is used, only 1 GPU is used. If the batch_size is larger than 1, both GPUs are used.
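This matches how DataParallel scatters the input: it splits the batch along dimension 0 with torch.chunk-style semantics, so a batch of 1 produces a single slice and the second GPU gets nothing. Here is a minimal sketch of those split sizes in plain Python (`scatter_sizes` is a hypothetical helper, not a PyTorch API), assuming chunk sizes of ceil(batch / n_gpus):

```python
import math

def scatter_sizes(batch_size: int, n_gpus: int) -> list[int]:
    """Per-GPU slice sizes when a batch is chunked across n_gpus
    (torch.chunk-style: chunks of ceil(batch_size / n_gpus))."""
    chunk = math.ceil(batch_size / n_gpus)
    sizes = []
    remaining = batch_size
    while remaining > 0:
        take = min(chunk, remaining)
        sizes.append(take)
        remaining -= take
    return sizes

print(scatter_sizes(1, 2))  # [1] -> only the first GPU receives data
print(scatter_sizes(2, 2))  # [1, 1] -> both GPUs receive one sample
```

GPUs beyond the number of chunks simply receive no input and stay idle, which is why only one of your two GPUs was used with batch size 1.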


(Joey Wrong) #3

May I ask whether you tested on images or NLP?


(ChengLu She) #4

Tested on images. I only have 2 GPUs; with more GPUs, the batch size should be at least the number of GPUs so that every GPU receives data.
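A small self-contained check of that rule, assuming the same torch.chunk-style splitting (`active_gpus` is a hypothetical helper for illustration, not part of PyTorch):

```python
import math

def active_gpus(batch_size: int, n_gpus: int) -> int:
    """Number of GPUs that actually receive data when the batch is
    split into chunks of size ceil(batch_size / n_gpus)."""
    chunk = math.ceil(batch_size / n_gpus)
    return math.ceil(batch_size / chunk)

print(active_gpus(3, 4))  # 3 -> one of four GPUs stays idle
print(active_gpus(4, 4))  # 4 -> all four GPUs work
```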