It seems like the Torchtext Iterator doesn’t omit the last batch, which usually doesn’t conform to the batch_size parameter.
Is there a way to omit the last batch from the Iterator?
Are you feeding this Iterator to a DataLoader? If so, you could set drop_last=True.
I’m not that deeply familiar with Torchtext, so let me know if that works for you.
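For illustration, here is a minimal sketch of the drop_last behavior using a plain TensorDataset (rather than a Torchtext dataset, which is an assumption about your setup):

```python
import torch
from torch.utils.data import DataLoader, TensorDataset

# 10 samples with batch_size=3: the 4th batch would only contain 1 sample.
dataset = TensorDataset(torch.arange(10).float())

# drop_last=True discards the incomplete final batch.
loader = DataLoader(dataset, batch_size=3, drop_last=True)
print(len(list(loader)))  # 3 full batches

# drop_last=False (the default) keeps the short final batch.
loader_keep = DataLoader(dataset, batch_size=3, drop_last=False)
print(len(list(loader_keep)))  # 4 batches, the last with a single sample
```

With drop_last=True every batch the model sees has exactly batch_size samples, which avoids shape-related surprises in code that assumes a fixed batch dimension.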