Is there a "PyTorch-ic" way to do padding and batching?

I’m new to using PyTorch for RNNs and loving it so far. That said, my model currently trains only on individual sequences (i.e., batch size = 1).

As I move towards training in batches, I’ve been going through many different implementations for padding and batching the tensors.

I also noticed PyTorch has pad_sequence, which seems useful, though it returns a padded tensor rather than a PackedSequence object.
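For context, here is a minimal sketch of how I understand these utilities fit together: pad_sequence pads the batch, and pack_padded_sequence then converts the padded tensor into the PackedSequence that RNN layers accept (the shapes and GRU sizes below are just illustrative):

```python
import torch
from torch.nn.utils.rnn import pad_sequence, pack_padded_sequence, pad_packed_sequence

# Three variable-length sequences, each with feature dim 4
seqs = [torch.randn(5, 4), torch.randn(3, 4), torch.randn(2, 4)]
lengths = torch.tensor([s.size(0) for s in seqs])

# pad_sequence stacks them into one (max_len, batch, feat) tensor
padded = pad_sequence(seqs)  # shape: (5, 3, 4)

# pack_padded_sequence wraps the padded tensor as a PackedSequence,
# letting RNN layers skip computation on the padding steps
# (sequences here are already sorted by descending length)
packed = pack_padded_sequence(padded, lengths, enforce_sorted=True)

rnn = torch.nn.GRU(input_size=4, hidden_size=8)
out_packed, h = rnn(packed)

# pad_packed_sequence inverts the packing for downstream layers
out, out_lengths = pad_packed_sequence(out_packed)
print(out.shape)  # (5, 3, 8)
```

This runs, but I’m not sure whether chaining pad_sequence and pack_padded_sequence by hand like this is the intended idiom.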

Is there a PyTorch recommended way to do padding and batching?