Hi, this is my first time using pack_padded_sequence, and it seems to expect the sequences sorted by length in decreasing order. Here is the default signature of pack_padded_sequence from the official docs:
torch.nn.utils.rnn.pack_padded_sequence(input, lengths, batch_first=False, enforce_sorted=True)
- enforce_sorted: if True, the input is expected to contain sequences sorted by length in a decreasing order. If False, the input will get sorted unconditionally. Default: True
It also mentions that enforce_sorted = True is only necessary for ONNX export. What does this mean? If I want to pass the lengths without sorting them, can I simply set it to False?
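For context, here is a minimal sketch of what I mean (the tensor shapes and lengths are made up for illustration): the lengths are deliberately not sorted, and enforce_sorted=False is passed.

```python
import torch
from torch.nn.utils.rnn import pack_padded_sequence

# A padded batch of 3 sequences (batch_first=True); lengths are NOT sorted.
padded = torch.randn(3, 5, 4)          # (batch, max_len, features)
lengths = torch.tensor([2, 5, 3])      # unsorted on purpose

# With enforce_sorted=False, pack_padded_sequence sorts internally and
# remembers the permutation, so downstream RNNs and unpacking stay consistent.
packed = pack_padded_sequence(padded, lengths, batch_first=True,
                              enforce_sorted=False)

print(packed.batch_sizes)     # tensor([3, 3, 2, 1, 1])
print(packed.sorted_indices)  # tensor([1, 2, 0]) — the internal sort order
```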
Below is my idea of the implementation:

# in the dataset class:
# return the padded question indices and the original question length
class VQADataset(Dataset):
    def __getitem__(self, idx):
        return {'question': question, 'length': len(qst_token)}
# in the model class
class Model(nn.Module):
    def forward(self, qst, qst_length):
        word = self.embedding(qst)
        qst_length = qst_length.tolist()
        packed = rnn.pack_padded_sequence(word, qst_length, batch_first=True)
So the question lengths will be different in every batch and will not be sorted. Can I simply set enforce_sorted to False?
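To make the question concrete, here is a small end-to-end sketch of the forward pass I have in mind, with hypothetical sizes and token ids (vocab_size, embed_dim, hidden_dim and the qst tensor are all invented for illustration):

```python
import torch
import torch.nn as nn
from torch.nn.utils.rnn import pack_padded_sequence, pad_packed_sequence

# Hypothetical sizes, just for illustration
vocab_size, embed_dim, hidden_dim = 100, 8, 16

embedding = nn.Embedding(vocab_size, embed_dim, padding_idx=0)
lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True)

# A padded batch of token ids; true lengths are unsorted
qst = torch.tensor([[4, 9, 0, 0],
                    [7, 2, 5, 1],
                    [3, 6, 8, 0]])
qst_length = torch.tensor([2, 4, 3])

word = embedding(qst)                       # (batch, max_len, embed_dim)
packed = pack_padded_sequence(word, qst_length, batch_first=True,
                              enforce_sorted=False)
packed_out, (h_n, c_n) = lstm(packed)
out, out_lengths = pad_packed_sequence(packed_out, batch_first=True)

print(out.shape)     # torch.Size([3, 4, 16])
print(out_lengths)   # tensor([2, 4, 3]) — original batch order restored
```

If this is correct, then sorting in the Dataset or collate function would be unnecessary, since the packing handles the permutation internally.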