PyTorch indexing error message seems weird

I have the following code:

import torch
import torch as tr

class F(torch.autograd.Function):
    def forward(self, i):
        self.save_for_backward(i)
        return tr.FloatTensor([1])

    def backward(self, c):
        a = tr.zeros(10, 10)
        i = self.saved_tensors
        a[i] = tr.ones(3, 10)  # Error!!!
        return None

f = F()
f(tr.LongTensor([2, 3, 4]))

But this gives the error:

TypeError: indexing a tensor with an object of type torch.LongTensor. The only supported types are integers, slices, numpy scalars and torch.LongTensor or torch.ByteTensor as the only argument

This is a weird message, since the given i has type tuple, not LongTensor.
If I change my code as below:

    def backward(self, c):
        a = tr.zeros(10, 10)
        i = self.saved_tensors[0]
        a[i] = tr.ones(3, 10)
        return None

then the error is resolved.
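The root cause is that save_for_backward always hands tensors back as a tuple, even when only a single tensor was saved, so self.saved_tensors is a 1-tuple here. A minimal sketch of the two indexing attempts, outside of autograd and on plain tensors (the variable names below just mirror the snippet above):

```python
import torch

# save_for_backward stores tensors as a tuple, even for one tensor;
# this mimics what self.saved_tensors returns inside backward.
saved_tensors = (torch.LongTensor([2, 3, 4]),)

a = torch.zeros(10, 10)

# a[saved_tensors] indexes with the tuple itself, which is what
# triggered the confusing TypeError in the PyTorch version above.

# Unpacking the tuple first gives a plain LongTensor index, which works:
i = saved_tensors[0]
a[i] = torch.ones(3, 10)  # rows 2, 3 and 4 become ones
```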

Hi! I can do that without any error:

In [13]: a = torch.zeros(10, 10)

In [14]: b = torch.ones(3, 10)

In [15]: a[i] = b

In [16]: a
Out[16]:

    0     0     0     0     0     0     0     0     0     0
    1     1     1     1     1     1     1     1     1     1
    1     1     1     1     1     1     1     1     1     1
    1     1     1     1     1     1     1     1     1     1
    0     0     0     0     0     0     0     0     0     0
    0     0     0     0     0     0     0     0     0     0
    0     0     0     0     0     0     0     0     0     0
    0     0     0     0     0     0     0     0     0     0
    0     0     0     0     0     0     0     0     0     0
    0     0     0     0     0     0     0     0     0     0
[torch.FloatTensor of size 10x10]
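For reference, the session above runs cleanly once i is defined as an index tensor. Its value was not shown in the post; [1, 2, 3] below is an assumption inferred from the rows of ones in the printed output:

```python
import torch

# Assumed value of i, matching the printed result (rows 1-3 are ones).
i = torch.LongTensor([1, 2, 3])
a = torch.zeros(10, 10)
b = torch.ones(3, 10)
a[i] = b  # rows 1, 2 and 3 become ones
```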

Please give us more information, e.g. your PyTorch version, OS, etc.

Thanks! But I have modified my question, since the previous example did not reproduce the exact problem I encountered.

Hmm, I modified self.save_for_backend to self.save_for_backward because there is no save_for_backend in Function.

class F(torch.autograd.Function):
    def forward(self, i):
        self.save_for_backward(i)  # <- fixed here
        return tr.FloatTensor([1])

    def backward(self, c):
        a = tr.zeros(10, 10)
        i = self.saved_tensors
        a[i] = tr.ones(3, 10)
        return None

f = F()
f(torch.autograd.Variable(tr.LongTensor([2, 3, 4])))

Then it returns:

Variable containing:
 1
[torch.FloatTensor of size 1]

I think the canonical way is to make the left-hand side a tuple, i.e.

i, = self.saved_tensors

(note the comma on the left hand side).

Best regards

Thomas