For anyone wondering whether PyTorch has a built-in function that takes a list of tensors and pads them to the same length: yes, `torch.nn.utils.rnn.pad_sequence` does the trick:
https://pytorch.org/docs/stable/nn.html#torch.nn.utils.rnn.pad_sequence
Example:
>>> import torch
>>> l = [torch.Tensor([1., 2.]), torch.Tensor([3.])]
>>> torch.nn.utils.rnn.pad_sequence(l, batch_first=True, padding_value=0)
tensor([[1., 2.],
        [3., 0.]])
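A common follow-up is feeding the padded batch to an RNN without letting it process the padding. Here is a short sketch (not from the original answer; the variable names are illustrative) of pairing `pad_sequence` with `pack_padded_sequence`:

```python
import torch
from torch.nn.utils.rnn import pad_sequence, pack_padded_sequence

# Two sequences of different lengths (illustrative data)
seqs = [torch.tensor([1., 2., 3.]), torch.tensor([4.])]
lengths = torch.tensor([len(s) for s in seqs])

# Pad to the length of the longest sequence -> shape (2, 3)
padded = pad_sequence(seqs, batch_first=True, padding_value=0.)

# Pack so an RNN skips the padded steps; unsqueeze adds a feature dim,
# enforce_sorted=False lets the sequences appear in any order.
packed = pack_padded_sequence(padded.unsqueeze(-1), lengths,
                              batch_first=True, enforce_sorted=False)
```

The padding value itself carries no meaning to the model, so packing (or an explicit mask) is what keeps it from influencing the result.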