Enforce pad_sequence to a certain length

I have a set of tensors that I'm padding with pad_sequence, but I need to guarantee a fixed length for them. I can't do that right now, because pad_sequence only extends the shorter tensors up to the longest one in the batch; if that longest tensor doesn't reach the length I want, I'm stuck. I thought a solution could be adding zeros to one of the tensors to pad it up to my target length, so the result of the batch padding would have the desired length, but I don't know how to do it.

So let's say I have a tensor with shape torch.Size([44]) and a desired length of 50: how can I add zeros to it to reach a shape of torch.Size([50])? This needs to work regardless of the initial tensor's length.
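A minimal sketch of padding a single 1-D tensor up to a fixed length with torch.nn.functional.pad (the helper name pad_to_length is my own, not a PyTorch API):

```python
import torch
import torch.nn.functional as F

def pad_to_length(t, length, value=0.0):
    # Right-pad a 1-D tensor with `value` up to `length`.
    # Assumes t.shape[0] <= length.
    return F.pad(t, (0, length - t.shape[0]), value=value)

t = torch.randn(44)
padded = pad_to_length(t, 50)
print(padded.shape)  # torch.Size([50])
```

F.pad takes the padding as a (left, right) pair for the last dimension, so (0, length - t.shape[0]) appends zeros only at the end and leaves the original values untouched.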


You could append a dummy tensor of the target length to the list and remove the corresponding row afterwards. For example: torch.empty(50).
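A sketch of that workaround, assuming a batch of 1-D tensors and batch_first=True (the dummy's contents don't matter because its row is dropped):

```python
import torch
from torch.nn.utils.rnn import pad_sequence

seqs = [torch.randn(44), torch.randn(30)]
target_len = 50

# Append a dummy tensor of the target length so pad_sequence
# pads everything up to 50, then drop the dummy's row.
dummy = torch.empty(target_len)
padded = pad_sequence(seqs + [dummy], batch_first=True)[:-1]
print(padded.shape)  # torch.Size([2, 50])
```

Since pad_sequence pads every sequence to the longest one in the batch, the 50-element dummy forces the padded length to 50, and slicing off the last row restores the original batch size.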



Keras has an option for this (the maxlen argument of pad_sequences), but PyTorch's pad_sequence doesn't. I guess Unity05's workaround is a good fix.