Masking Recurrent layers

I can’t find a solution to this common problem. How can we mask padded input sequences in RNNs?

Check out pack_padded_sequence and pad_packed_sequence - https://pytorch.org/docs/stable/_modules/torch/nn/utils/rnn.html

Thank you for your answer. Could you give me an example, please?
Do I just apply pack_padded_sequence to the padded tensor, and masking then happens automatically in the subsequent recurrent layers?

I think this should be what you’re looking for - Simple working example how to use packing for variable-length sequence inputs for rnn
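Roughly, yes: you pack the padded tensor (together with the true sequence lengths) before feeding it to the recurrent layer, and unpack afterwards. Here is a minimal sketch of that pattern; the batch size, sequence lengths, feature size, and hidden size are made up for illustration:

```python
import torch
import torch.nn as nn
from torch.nn.utils.rnn import pack_padded_sequence, pad_packed_sequence

# Two sequences of lengths 3 and 2, padded to length 3 (batch_first layout).
batch = torch.zeros(2, 3, 4)          # (batch, seq_len, features)
batch[0, :3] = torch.randn(3, 4)      # full-length sequence
batch[1, :2] = torch.randn(2, 4)      # shorter sequence; last step is padding
lengths = torch.tensor([3, 2])        # true lengths, sorted in decreasing order

rnn = nn.LSTM(input_size=4, hidden_size=5, batch_first=True)

# Pack so the LSTM never processes the padded time steps.
packed = pack_padded_sequence(batch, lengths, batch_first=True)
packed_out, (h_n, c_n) = rnn(packed)

# Unpack back to a padded tensor; padded positions are filled with zeros.
out, out_lengths = pad_packed_sequence(packed_out, batch_first=True)

print(out.shape)       # torch.Size([2, 3, 5])
print(out_lengths)     # tensor([3, 2])
print(out[1, 2])       # all zeros: this step was padding, so no output
```

Note that older PyTorch versions require the lengths to be sorted in decreasing order before packing (newer versions accept `enforce_sorted=False` to handle that for you), and `h_n` already holds the hidden state at each sequence's last *valid* step, so you often don't need to index into `out` yourself.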
