I am building a neural network for text input with an embedding layer, a CNN, pooling, and a few linear layers. However, I am having trouble making it work with variable-length sequences. I saw that `nn.Embedding` has a `padding_idx` argument. Should I use that? If so, what happens when the max length of a batch differs from the max length of the whole corpus? Thanks!
And is padding even good practice in this case?
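To make the question concrete, here is a minimal sketch of what I have in mind (the vocab size, embedding dim, and token indices are made up; index 0 is reserved for padding). Sequences are padded only to the batch max length, not the corpus max:

```python
import torch
import torch.nn as nn
from torch.nn.utils.rnn import pad_sequence

# Hypothetical embedding layer; index 0 is reserved for padding
embedding = nn.Embedding(num_embeddings=10, embedding_dim=4, padding_idx=0)

# Two variable-length sequences of token indices (all >= 1)
seqs = [torch.tensor([1, 2, 3]), torch.tensor([4, 5])]

# Pad to the max length in THIS batch (3 here), not the corpus max
batch = pad_sequence(seqs, batch_first=True, padding_value=0)
# batch is [[1, 2, 3], [4, 5, 0]] with shape (2, 3)

emb = embedding(batch)  # shape (2, 3, 4)
# The padding position embeds to all zeros because of padding_idx=0
```

Is this the intended way to combine `padding_idx` with batch-level padding before feeding a CNN?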