Pre-padding a sequence

Hi, is there a way of pre-padding a text sequence? The pad_sequence implementation gives a post-padded result.

train_data = pad_sequence(train_data, batch_first=True, padding_value=vocab.stoi['<pad>'])
train_data[0]
out: tensor([60, 15,  1,  1,  1,  1,  1,  1,  1,  1,  1,  1,  1,  1,  1,  1,  1,  1, 1,  1,  1,  1])

The result I want to get is:

out: tensor([1,  1,  1,  1,  1,  1,  1,  1,  1,  1,  1,  1,  1,  1,  1,  1, 1,  1,  1,  1, 60, 15])
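Since `pad_sequence` has no built-in pre-padding option, one possible workaround is to flip each sequence, pad at the end as usual, and then flip the padded batch back so the padding lands at the front. A minimal sketch (the `pre_pad_sequence` helper name and the example tensors are my own, and the padding value is hard-coded to 1 in place of `vocab.stoi['<pad>']`):

```python
import torch
from torch.nn.utils.rnn import pad_sequence

def pre_pad_sequence(sequences, padding_value=0):
    # Reverse each sequence so pad_sequence's trailing padding
    # ends up at the front after we reverse the batch again.
    flipped = [seq.flip(0) for seq in sequences]
    padded = pad_sequence(flipped, batch_first=True, padding_value=padding_value)
    return padded.flip(1)

train_data = [torch.tensor([60, 15]), torch.tensor([3, 4, 5])]
out = pre_pad_sequence(train_data, padding_value=1)
# out[0] -> tensor([ 1, 60, 15])
```

This keeps all the length handling inside `pad_sequence` itself, at the cost of two extra `flip` copies per batch.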