Hi,
I ran a small test of gradient computation and found that the result is incorrect.
Say I have x = [x0, x1, x2, x3] and index it with [0, 0, 3] to get y = [x0, x0, x3]. Summing gives z = y0 + y1 + y2 = 2 * x0 + x3, so the gradient of z with respect to x should be [2, 0, 0, 1]. However, autograd returns [1, 0, 0, 1].
The script is as follows:
#!/usr/bin/env python
import torch
from torch.autograd import Variable

x = Variable(torch.rand(4), requires_grad=True)
print('x = [%s]' % ', '.join('%g' % i for i in x.data))

# Index with a LongTensor that repeats index 0.
y = x[torch.Tensor([0, 0, 3]).long()]
print('y = [x[0], x[0], x[3]] = [%s]' % ', '.join('%g' % i for i in y.data))

z = torch.sum(y)
print('z = sum(y) = 2 * x[0] + x[3] = %g' % z.data[0])

z.backward()
print('grad_x(z) = [%s]' % ', '.join('%g' % i for i in x.grad.data))
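For what it's worth, a possible workaround is to select with torch.index_select instead of fancy indexing. This is a hedged sketch: it assumes index_select's backward pass accumulates contributions from duplicate indices (via an index-add), which should give the expected [2, 0, 0, 1].

```python
import torch

# Sketch of a workaround, assuming index_select's backward accumulates
# over repeated indices. Uses the tensor API directly; on recent PyTorch
# versions Variable wrappers are no longer needed.
x = torch.rand(4, requires_grad=True)
idx = torch.tensor([0, 0, 3])

y = torch.index_select(x, 0, idx)  # y = [x0, x0, x3]
z = torch.sum(y)                   # z = 2 * x0 + x3
z.backward()

print('grad_x(z) = %s' % x.grad.tolist())
```

If this prints [2.0, 0.0, 0.0, 1.0], the accumulation is done correctly on that code path, and the plain-indexing result above is the buggy one.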