Error in using torch.gather

File “/home/h/anaconda3/lib/python3.5/site-packages/torch/autograd/”, line 620, in gather
return Gather(dim)(self, index)
RuntimeError: expected a Variable argument, but got LongTensor

The code is:

    def forward(self, sentence, lengths):
        embed = self.embedding(sentence)
        packed_seq = pack_padded_sequence(embed, lengths, batch_first=True)

        out, _ = self.lstm(packed_seq)
        unpacked, unpacked_len = pad_packed_sequence(out, batch_first=True)
        maske = torch.LongTensor(unpacked_len).view(-1, 1, 1).expand_as(unpacked)
        spaces = self.out(unpacked.gather(1, maske))
        return spaces

The documentation says that the index should be a tensor, but that does not seem to work.
If I use Variable(maske), the error becomes:
File “/home/h/anaconda3/lib/python3.5/site-packages/torch/autograd/_functions/”, line 542, in forward
return input.gather(self.dim, index)
RuntimeError: Invalid index in gather at /py/conda-bld/pytorch_1490983232023/work/torch/lib/TH/generic/THTensorMath.c:441

It should be a Variable(torch.LongTensor( ... ))

‘invalid index’ means you have a value in the LongTensor that is less than 0, or greater than or equal to the size of the dimension you are gathering over.
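To illustrate (a minimal sketch with made-up shapes, not your actual data): gather raises this error as soon as any index value reaches the size of the gathered dimension.

```python
import torch

t = torch.zeros(2, 3)

# dim 1 has size 3, so valid gather indices are 0, 1, 2; index 3 is invalid
bad_idx = torch.tensor([[3]])
try:
    t.gather(1, bad_idx)
    failed = False
except RuntimeError:
    failed = True
print("out-of-range gather raised an error:", failed)
```

In your case the raw lengths run from 1 to seq_len, so the largest length equals the size of dim 1 and is out of range by one.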


Thank you.
It seems that I should use

   unpacked_len - 1
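For reference, a minimal sketch of the corrected gather (using the modern PyTorch API, where tensors no longer need to be wrapped in Variable; the batch size, sequence length, and hidden size here are illustrative assumptions):

```python
import torch

# Illustrative padded LSTM output: batch 2, max length 3, hidden size 4
unpacked = torch.arange(24, dtype=torch.float32).view(2, 3, 4)
unpacked_len = torch.tensor([3, 2])  # actual length of each sequence

# gather indexes along dim 1 (time), so valid indices are 0 .. seq_len - 1;
# using the raw lengths overflows by one, hence lengths - 1
idx = (unpacked_len - 1).view(-1, 1, 1).expand(-1, 1, unpacked.size(2))
last_outputs = unpacked.gather(1, idx).squeeze(1)  # (batch, hidden)
print(last_outputs)
```

This variant expands the index to a single time step rather than expand_as(unpacked), so the result holds one vector per sequence: the output at its last valid position.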