Device-side assert triggered at /pytorch/aten/src/THC/generic/

I’m getting this error during evaluation.

Here is the error snippet:

    /pytorch/aten/src/THC/ void indexSelectLargeIndex(TensorInfo<T, IndexType>, TensorInfo<T, IndexType>, TensorInfo<long, IndexType>, int, int, IndexType, IndexType, long) [with T = float, IndexType = unsigned int, DstDim = 2, SrcDim = 2, IdxDim = -2, IndexIsMajor = true]: block: [7,0,0], thread: [127,0,0] Assertion `srcIndex < srcSelectDimSize` failed.
    THCudaCheck FAIL file=/pytorch/aten/src/THC/generic/ line=59 error=59 : device-side assert triggered

    Traceback (most recent call last):
      File "", line 102, in <module>
      File "", line 93, in main
        results = evaluate(model, val_loader, maps["aid_to_ans"], device)
      File "/home/cs15resch01005/.local/lib/python2.7/site-packages/torch/autograd/", line 43, in decorate_no_grad
        return func(*args, **kwargs)
      File "", line 30, in evaluate
        output = model(img, q, lengths)
      File "/home/cs15resch01005/.local/lib/python2.7/site-packages/torch/nn/modules/", line 489, in __call__
        result = self.forward(*input, **kwargs)
      File "/raid/cs15resch01005/vqa/VisualQA/models/", line 118, in forward
        q = self.embedding(ques)  # BxTxD
      File "/home/cs15resch01005/.local/lib/python2.7/site-packages/torch/nn/modules/", line 489, in __call__
        result = self.forward(*input, **kwargs)
      File "/home/cs15resch01005/.local/lib/python2.7/site-packages/torch/nn/modules/", line 92, in forward
        input = module(input)
      File "/home/cs15resch01005/.local/lib/python2.7/site-packages/torch/nn/modules/", line 489, in __call__
        result = self.forward(*input, **kwargs)
      File "/home/cs15resch01005/.local/lib/python2.7/site-packages/torch/nn/modules/", line 292, in forward
        return torch.tanh(input)
    RuntimeError: cuda runtime error (59) : device-side assert triggered at /pytorch/aten/src/THC/generic/

Can anyone tell me how to go about it? Thanks in advance.



The error comes from `q = self.embedding(ques)  # BxTxD`, so it happens inside your embedding layer, and the assert states ``Assertion `srcIndex < srcSelectDimSize` failed``.
So I would guess that one of the indices given to your embedding layer in `ques` is larger than the number of embeddings it was created with.
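One way to see this clearly: CUDA device-side asserts are reported asynchronously, so the traceback can be confusing; on CPU the same bug raises a plain `IndexError` at the exact call site. A minimal sketch of the failure mode (the `vocab_size` and indices below are made up for illustration):

```python
import torch
import torch.nn as nn

vocab_size, embed_dim = 10, 4
emb = nn.Embedding(vocab_size, embed_dim, padding_idx=0)

good = torch.tensor([[1, 2, 3]])
bad = torch.tensor([[1, 2, 10]])  # 10 is out of range for vocab_size=10

out = emb(good)        # works, shape (1, 3, 4)
print(out.shape)

try:
    emb(bad)           # on CPU this raises IndexError instead of a device assert
except IndexError as e:
    print("out-of-range index:", e)
```

Running the evaluation once on CPU (or with `CUDA_LAUNCH_BLOCKING=1`) will point you at the exact batch that contains the bad index.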

Hi, thank you.

I’m a bit confused by the answer… am I trying to embed a token that is not in the vocabulary, or is it an issue with the length of the question?

Here is my embedding layer initialization:

    self.embedding = nn.Sequential(
        nn.Embedding(vocab_size, embed_dim, padding_idx=0),
        # nn.Linear(vocab_size, embed_dim),
        # …
    )

All the indices passed to the forward should be in `[0, vocab_size - 1]`.
This error happens because `ques` contains a value outside that range.
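A quick sanity check you can add just before the embedding call will find the offending token ids; `ques` and `vocab_size` below stand in for your actual tensor and vocabulary size:

```python
import torch

vocab_size = 10
ques = torch.tensor([[1, 5, 12, 0]])  # 12 would trigger the assert

# Collect any ids outside the valid range [0, vocab_size - 1].
bad = ques[(ques < 0) | (ques >= vocab_size)]
if bad.numel() > 0:
    print("out-of-range token ids:", bad.tolist())
```

A common cause is building the vocabulary on the training set only, so unseen tokens in the validation set get ids beyond `vocab_size`; mapping unknowns to a dedicated `<unk>` id (and passing the full vocabulary size to `nn.Embedding`) fixes it.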