Batch size is getting changed

In my code I defined the batch size as 50, but a few lines later I am getting a batch size of 42. I did not make any change to the batch size, yet it is getting changed automatically.
What could be the reason behind it?

You probably have 50*n + 42 training samples (42, 92, 142, etc.), so the final batch of the loop contains only the remaining 42 samples.
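
A quick way to check this, as a sketch (the sample count below is just an assumed value for illustration; substitute your actual dataset size):

num_samples = 292        # assumed value for illustration; use your own dataset size
batch_size = 50
full_batches, last_batch = divmod(num_samples, batch_size)
print(full_batches, last_batch)   # -> 5 full batches of 50, then a final batch of 42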


Could you please elaborate a little more? What is “n” here? In my case I have 146 training examples.

Batch size can’t change on its own. The only reasonable explanation is that the number of elements left in the pool for the last batch is less than 50: if you have 146 examples, your last batch’s size should be 46. Since it’s 42, something is off with your dataloader or your object’s shape.
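
As a minimal sketch of how this typically shows up, assuming a PyTorch DataLoader (the dataset below is synthetic, purely for illustration):

import torch
from torch.utils.data import DataLoader, TensorDataset

# Synthetic dataset with 292 samples of 10 features each (illustrative only)
features = torch.randn(292, 10)
labels = torch.randint(0, 2, (292,))
dataset = TensorDataset(features, labels)

loader = DataLoader(dataset, batch_size=50, shuffle=True)
for i, (x, y) in enumerate(loader):
    print(i, x.shape[0])   # batches 0-4 have 50 samples, batch 5 has 42

# To keep every batch at exactly 50 samples, discard the incomplete final batch:
loader = DataLoader(dataset, batch_size=50, shuffle=True, drop_last=True)

With drop_last=True the leftover samples are simply skipped each epoch, so the trailing batch of 42 never appears.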

Thank you so much, I get it now. By mistake I wrote 146; it was actually 292. Now it makes sense: 292 = 5 × 50 + 42, so the sixth batch has 42 samples.