Batch size suddenly changes during training!

Hi all,

I’m working with a GAN model on my dataset, and everything was going well. But when I printed the input size before and after each layer, I noticed that at some point (after many successful iterations) the batch size suddenly changes, and the run stops because of a dimension mismatch.

So, originally I chose batch_size = 36, and when it changes, it becomes 24!

And as I said, it works well for many iterations, but then this happens!

This is an example of what I did:

# x has size [36, 1, 32, 32]
x = F.relu(self.conv1(x))
print(x.shape)  # here x will be [36, 128, 16, 16]

And after a number of iterations, the x after x = F.relu(self.conv1(x)) becomes [24, 128, 16, 16].

Is there an explanation for this strange thing, please?

Hi @Sara_Wasl,

Does it always happen at the same iteration? Maybe the last iteration of the first epoch?

If your dataloader has drop_last set to False (the default value), and your dataset size is not a multiple of your batch size, then the last iteration of an epoch will have a batch of size dataset_size % batch_size.
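As a minimal illustration of that behavior (plain Python with hypothetical numbers, not the actual DataLoader implementation):

```python
# Sketch of DataLoader-style batching: when drop_last is False, the
# leftover samples form one short final batch.
def batch_sizes(dataset_size, batch_size, drop_last=False):
    full, remainder = divmod(dataset_size, batch_size)
    sizes = [batch_size] * full
    if remainder and not drop_last:
        sizes.append(remainder)  # the short final batch
    return sizes

print(batch_sizes(100, 36))                  # [36, 36, 28]
print(batch_sizes(100, 36, drop_last=True))  # [36, 36]
```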

Thanks for your reply @spanev,

Yes, that’s true; it happened in the last iteration of the first epoch.

No, I didn’t set drop_last.

The last iteration number is 1667, the number of epochs is 50, and the batch_size is 36. Do you mean I should choose a more appropriate batch_size? If yes, could you please tell me how?
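For reference, those numbers pin down the dataset size: 1666 full batches of 36 plus a final batch of 24 is 60,000 samples, so any batch size that divides 60,000 evenly would avoid the short last batch. A quick check (the dataset size here is inferred from the thread, not stated explicitly):

```python
# Dataset size inferred from the thread: 1666 full batches of 36 plus
# a final batch of 24 (1667 iterations per epoch in total).
dataset_size = 1666 * 36 + 24
print(dataset_size)  # 60000

# Batch sizes in a reasonable range that divide the dataset evenly,
# so every batch (including the last) has the same size.
even_sizes = [b for b in range(16, 65) if dataset_size % b == 0]
print(even_sizes)  # [16, 20, 24, 25, 30, 32, 40, 48, 50, 60]
```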

So drop_last was set to False by default.
I think setting it to True is the best thing to do here.
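For completeness, a minimal sketch of what that looks like, using a hypothetical 100-sample dataset shaped like the inputs in this thread:

```python
import torch
from torch.utils.data import DataLoader, TensorDataset

# Hypothetical dataset: 100 samples of shape [1, 32, 32].
data = TensorDataset(torch.zeros(100, 1, 32, 32))

loader = DataLoader(data, batch_size=36, drop_last=True)
sizes = [batch[0].size(0) for batch in loader]
print(sizes)  # [36, 36] -- the 28 leftover samples are dropped
```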

@spanev I tried it just now (set it to True). The first epoch works well now, but it stops in the first iteration of the second epoch, and the batch size changed to 25!
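One possible cause of a mismatch like this (an assumption on my part, not confirmed in the thread) is a hard-coded batch_size elsewhere in the training loop, for example when sampling noise for the generator or building label tensors for the discriminator. Deriving the size from the actual incoming batch makes those pieces robust to whatever size the loader delivers:

```python
import torch

# Hypothetical helper: size the noise batch from the real batch itself
# rather than from a hard-coded batch_size constant.
def make_noise(real_batch, latent_dim=100):
    n = real_batch.size(0)  # actual batch size this iteration
    return torch.randn(n, latent_dim)

# Works for any batch size, e.g. a short batch of 24.
fake_noise = make_noise(torch.zeros(24, 1, 32, 32))
print(fake_noise.shape)  # torch.Size([24, 100])
```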