The easiest way to make a custom RNN handle variable-length sequences is to do what this repo does: https://github.com/jihunchoi/recurrent-batch-normalization-pytorch – it masks the hidden state at each timestep so finished sequences stop updating. That approach won't accept a PackedSequence, though, so it isn't a drop-in replacement for nn.LSTM. The PackedSequence path is fairly specific to the cuDNN implementation.
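The masking idea can be sketched roughly like this (a minimal illustration, not the repo's actual code; `masked_rnn_forward` and its signature are made up for the example):

```python
import torch
import torch.nn as nn

def masked_rnn_forward(cell, inputs, lengths):
    """Run an RNNCell over a zero-padded batch, freezing finished sequences.

    inputs:  (seq_len, batch, input_size), zero-padded along time
    lengths: (batch,) true sequence lengths
    Returns the final hidden state, (batch, hidden_size).
    """
    seq_len, batch, _ = inputs.shape
    h = torch.zeros(batch, cell.hidden_size)
    for t in range(seq_len):
        # mask is 1.0 while t < length for that sequence, else 0.0
        mask = (t < lengths).float().unsqueeze(1)
        h_new = cell(inputs[t], h)
        # once a sequence has ended, carry its old state forward unchanged
        h = mask * h_new + (1 - mask) * h
    return h
```

The key property is that a sequence's final hidden state doesn't depend on the padding after its true length, which is the same guarantee PackedSequence gives, just paid for with wasted compute on the padded steps.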