nn.Module using index_select

Hi,

I’m trying to create an nn.Module that uses index_select and I’m stuck on a small problem. The code below doesn’t do anything useful, but it demonstrates the issue:

import torch
from torch import nn
from torch.autograd import Variable
import numpy as np


class test(nn.Module):
    def forward(self, x):
        a = torch.LongTensor(np.arange(x.size()[1]))
        return x.index_select(1, a)


if __name__ == '__main__':
    dtype = torch.FloatTensor
    x = Variable(torch.zeros(1, 2, 8, 8).type(dtype), requires_grad=False)

    # Works
    model = torch.nn.Sequential(
        test(),
        nn.Conv2d(2, 2, 3, bias=True),
    )
    print(model)
    print(model(x))

    # Works
    model = torch.nn.Sequential(
        test(),
        test(),
        nn.Conv2d(2, 2, 3, bias=True),
    )
    print(model)
    print(model(x))

    # Doesn't work
    model = torch.nn.Sequential(
        nn.Conv2d(2, 2, 3, bias=True),
        test(),
        nn.Conv2d(2, 2, 3, bias=True),
    )
    print(model)
    print(model(x))

As the code shows, combining the new module with Conv2d only causes a problem when a convolutional layer is placed before the new “test” module (the case commented “Doesn’t work”). I get the following error message:

return IndexSelect.apply(self, dim, index)

RuntimeError: save_for_backward can only save input or output tensors, but argument 0 doesn't satisfy this condition

I would appreciate if you could help me out.
Thanks for the great software!

class test(nn.Module):
    def forward(self, x):
        a = torch.LongTensor(np.arange(x.size()[1]))
        return x.index_select(1, a)

Here, a needs to be a Variable, not a Tensor:

a = Variable(torch.LongTensor(np.arange(x.size()[1])))
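
For reference, here is a minimal sketch of the corrected module with the index wrapped in a Variable (it just applies the fix above, using the same pre-0.4, Variable-era API as the original post):

import torch
from torch import nn
from torch.autograd import Variable
import numpy as np


class test(nn.Module):
    def forward(self, x):
        # Wrap the index in a Variable so autograd can save it for backward
        a = Variable(torch.LongTensor(np.arange(x.size(1))))
        return x.index_select(1, a)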

This is not Pythonic. Why not support a more natural way to index?

As of v0.2.0 we now support a Pythonic way of doing advanced indexing. See https://github.com/pytorch/pytorch/releases/tag/v0.2.0 for details.
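
For example, a quick sketch of what that looks like (assuming v0.2.0 advanced-indexing semantics; the tensor here is just an example):

import torch
from torch.autograd import Variable

x = Variable(torch.randn(1, 3, 8, 8))

# NumPy-style advanced indexing along dim 1, no index_select needed
y = x[:, [0, 2, 1]]                    # reorder channels with a plain Python list
z = x[:, torch.LongTensor([0, 2, 1])]  # or with a LongTensor index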

Thanks for your work. Got it!

<ipython-input-76-7d079b1ce127> in forward(self, title, content)
     51     def forward(self, title, content):
     52         title = self.encoder_tit(title)
---> 53         title_rehape = torch.cat((torch.index_select(title, 2, torch.LongTensor([0,2])), 
     54                    torch.index_select(title, 2, torch.LongTensor([1]))), dim=2)
     55         title_out_1 = self.title_conv_1(title_rehape)

/home/quoniammm/anaconda3/envs/py3Tfgpu/lib/python3.6/site-packages/torch/autograd/variable.py in index_select(self, dim, index)
    679 
    680     def index_select(self, dim, index):
--> 681         return IndexSelect.apply(self, dim, index)
    682 
    683     def gather(self, dim, index):

RuntimeError: save_for_backward can only save input or output tensors, but argument 0 doesn't satisfy this condition

Here, title is a Variable, but I get the same error. What is the reason for it?
What I want to implement is to exchange the positions of column 1 and column 2.
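
If it helps, a hedged sketch based on the earlier answer: before v0.2.0 the index passed to index_select also has to be a Variable, not a plain LongTensor, and the swap can be done in a single index_select (or with advanced indexing on v0.2.0+). The tensor below is only a stand-in for title:

import torch
from torch.autograd import Variable

title = Variable(torch.randn(2, 4, 3))  # stand-in for the encoder output

# pre-v0.2.0: wrap the index in a Variable and reorder the last dim in one call
idx = Variable(torch.LongTensor([0, 2, 1]))
title_reshape = title.index_select(2, idx)

# v0.2.0+: the same swap via advanced indexing
title_reshape = title[:, :, [0, 2, 1]]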