Batch size on custom dataset

Hi

Do I need to set the batch size to be a factor of the total training data size? i.e. something like training_size = batch_size * n

You don’t. If you are using a DataLoader, it should handle a batch size that is not a factor of the training data size.
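For illustration, a minimal sketch with a hypothetical 100-sample TensorDataset (the dataset size and shapes here are made up for the example): the loader simply yields a smaller final batch instead of failing.

```python
import torch
from torch.utils.data import TensorDataset, DataLoader

# Hypothetical toy dataset: 100 samples, which is not a multiple of 64
data = torch.randn(100, 10)
labels = torch.randint(0, 2, (100,))
dataset = TensorDataset(data, labels)

loader = DataLoader(dataset, batch_size=64, shuffle=True)

for batch_idx, (x, y) in enumerate(loader):
    # First batch: torch.Size([64, 10]); last batch: torch.Size([36, 10])
    print(batch_idx, x.shape)
```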


But I got an error saying something like the size of the last batch does not match the previous batch sizes. I used torch.utils.data.DataLoader(training_data, batch_size=64, shuffle=True, num_workers=2)

You can add drop_last=True to your DataLoader constructor; that should fix the problem.
Have a look at the documentation for what that option does.
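A minimal sketch of that fix, reusing the hypothetical 100-sample dataset from the earlier example: with drop_last=True the incomplete final batch is discarded, so every batch the training loop sees has exactly batch_size samples.

```python
from torch.utils.data import DataLoader

# drop_last=True discards the leftover samples that don't fill a full batch
loader = DataLoader(dataset, batch_size=64, shuffle=True,
                    num_workers=2, drop_last=True)

for x, y in loader:
    print(x.shape)  # always torch.Size([64, 10]); the remaining 36 samples are dropped
```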
