return torch.stack(batch, 0, out=out)
RuntimeError: stack expects each tensor to be equal size, but got [21] at entry 0 and [23] at entry 1

Why is this error occurring?

Your batch consists of tensors of different lengths ([21] and [23]). The default collate_fn of PyTorch's DataLoader stacks the samples with torch.stack, which requires every tensor in the batch to have the same shape. Padding the sequences to a common length, for example with torch.nn.utils.rnn.pad_sequence in a custom collate_fn (or a padded-stack helper from an external library), should solve this.
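A minimal sketch of the pad_sequence approach, assuming the samples are 1-D tensors of varying length (the function name pad_collate is illustrative, not a PyTorch API):

```python
import torch
from torch.nn.utils.rnn import pad_sequence

def pad_collate(batch):
    # batch: list of 1-D tensors of varying length.
    # pad_sequence right-pads every tensor to the longest length
    # in the batch, producing one stackable 2-D tensor.
    return pad_sequence(batch, batch_first=True, padding_value=0)

# Same lengths as in the error message: [21] and [23].
batch = [torch.ones(21), torch.ones(23)]
padded = pad_collate(batch)
print(padded.shape)  # torch.Size([2, 23])
```

You can then pass it to the loader, e.g. DataLoader(dataset, batch_size=2, collate_fn=pad_collate), so batching no longer fails on unequal sizes.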