The last batch might be smaller if the number of samples in your Dataset is not divisible by your batch size without a remainder.
If you don’t want this behavior, you can specify drop_last=True in your DataLoader. Alternatively, keep the (possibly) smaller last batch and make your reshaping code batch-size-agnostic, e.g. x = x.view(x.size(0), -1) instead of hard-coding the batch size.
Only the very last batch can be smaller, so with drop_last=True you’ll lose at most (batch_size - 1) samples per epoch, which usually shouldn’t be a problem.
As I said, you could alternatively keep the smaller batch and adapt your reshaping code. Which option to pick really depends on your preferences, as long as your model is not sensitive to the batch size (e.g. batch norm layers would be updated with a noisy stats estimate for very small batches).
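Here is a small sketch of both options, using a hypothetical TensorDataset of 10 samples with batch_size=4 (so 2 samples remain for the last batch):

```python
import torch
from torch.utils.data import DataLoader, TensorDataset

# Hypothetical dataset: 10 samples of shape [3, 4]; 10 % 4 = 2,
# so the last batch will contain only 2 samples by default.
dataset = TensorDataset(torch.randn(10, 3, 4))

# Default behavior: the last batch is smaller.
loader = DataLoader(dataset, batch_size=4)
sizes = [x.size(0) for (x,) in loader]
print(sizes)  # [4, 4, 2]

# drop_last=True: the incomplete last batch is skipped entirely.
loader_drop = DataLoader(dataset, batch_size=4, drop_last=True)
print([x.size(0) for (x,) in loader_drop])  # [4, 4]

# Alternatively, keep the smaller batch and reshape with x.size(0)
# instead of a hard-coded batch size:
for (x,) in loader:
    x = x.view(x.size(0), -1)  # also works for the smaller last batch
```

The x.view(x.size(0), -1) pattern flattens each sample while letting the first dimension follow whatever batch size actually arrives.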