How do you go about indexing a Variable with another Variable? For instance, it's not clear how you could implement a spatial transformer network, since the output of the transformer layer would itself be a Variable. Does calling idx.data work, or will that disconnect the graph?
import torch
from torch.autograd import Variable

x = Variable(torch.randn(3, 3))
idx = Variable(torch.LongTensor([0, 1]), requires_grad=True)

# doesn't work
print(x[idx])

# works, but can you call .backward()?
print(x[idx.data])

t = torch.sum(x[idx.data])
t.backward()  # gives an error: no graph nodes require gradients
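For what it's worth, a minimal sketch (using the post-0.4 API, where Variable has been merged into Tensor) suggesting the graph stays connected as long as the *indexed* tensor requires grad. The indices are integers and never carry gradients themselves, so in the snippet above the error seems to come from x not requiring grad, rather than from using idx.data:

```python
import torch

x = torch.randn(3, 3, requires_grad=True)  # gradients flow to x, not to the indices
idx = torch.tensor([0, 1])                 # integer index tensor; cannot require grad

t = torch.sum(x[idx])  # advanced indexing is differentiable w.r.t. x
t.backward()

print(x.grad)  # rows 0 and 1 receive gradient 1, row 2 stays 0
```

If the indices themselves need to be learned (as in a spatial transformer), hard integer indexing won't backpropagate to them; something continuous like grid sampling / bilinear interpolation is the usual workaround.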