Issue with pad_sequence

I'm using this code to pad my sequences, but the issue is that each mini-batch only gets padded to the longest sequence found within that mini-batch.
Is there a way to pad every sequence in the dataset to the given length?

import torch
from torch.nn.utils.rnn import pad_sequence

def collate_batch(batch):
    text_list, label_list = [], []

    for text, label in batch:
        processed_text = torch.Tensor(text_transform(text))
        text_list.append(processed_text)
        label_list.append(label)
    return pad_sequence(text_list, padding_value=3.0, batch_first=True), torch.Tensor(label_list)

trainloader = DataLoader(train_dataset, batch_size=32, shuffle=True, collate_fn=collate_batch)

I don't think so. Keras has a max-length parameter for padding, but PyTorch's pad_sequence does not. To my knowledge, what you are suggesting is unnecessary anyway: sequence lengths only need to be equal within each mini-batch, not across the entire dataset.
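That said, if you really do want every sequence padded (or truncated) to one fixed length, you can do it yourself inside the collate function with torch.nn.functional.pad instead of pad_sequence. A minimal sketch, assuming sequences arrive as lists or 1-D tensors of floats; MAX_LEN and the pad value 3.0 are placeholders you would set for your dataset:

```python
import torch
import torch.nn.functional as F

MAX_LEN = 10    # hypothetical dataset-wide maximum length
PAD_VALUE = 3.0

def collate_fixed(batch):
    # batch: list of (sequence, label) pairs
    text_list, label_list = [], []
    for text, label in batch:
        t = torch.as_tensor(text, dtype=torch.float)
        t = t[:MAX_LEN]                                        # truncate if too long
        t = F.pad(t, (0, MAX_LEN - t.size(0)), value=PAD_VALUE)  # right-pad to MAX_LEN
        text_list.append(t)
        label_list.append(label)
    # every tensor now has length MAX_LEN, so stacking is safe
    return torch.stack(text_list), torch.tensor(label_list)
```

Because every sequence is padded to the same global length, every batch in the DataLoader comes out with shape (batch_size, MAX_LEN), regardless of which sequences land in it.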