You see, almost every training setup uses batches.
I understand how batches and epochs work when
processing images, because every image has the same size (pixels and channels).
But in natural language, sentence lengths differ, and so do document lengths.
Because of that, the input and output sizes vary from sample to sample in word2vec
or RNN-family models.
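To make the problem concrete, here is a toy sketch (the sentences are made up for illustration) showing that tokenized samples come out with different shapes:

```python
# Hypothetical toy data: each sentence tokenizes to a
# different number of words, so stacking them into one
# fixed-size batch tensor is not straightforward.
sentences = [
    "I like tea",
    "Batching images is easy",
    "But sentences have different lengths",
]

# Split each sentence into word tokens.
tokenized = [s.split() for s in sentences]

# The per-sample lengths are all different.
lengths = [len(t) for t in tokenized]
print(lengths)  # -> [3, 4, 5]
```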
So how can I set the batch size?
In a CNN, a batch means a pile of images,
but in NLP does it mean some fixed number of words…?
Please help this noob. Thank you.