Extended Function does not enter backward

Hi,

I’m trying to extend the Function class, but when I call backward() the code never reaches my backward method. Any help would be appreciated.

My forward method:

    def forward(ctx, input):
        one = torch.ones(1).cuda()

        # multinomial/scatter_ expect a 2-D input, so add a batch dimension if needed
        if len(input.size()) == 1:
            input = torch.unsqueeze(input, 0)

        output = torch.zeros(input.size()).cuda()

        # Sample one index per row (epsilon keeps rows from being all zeros)
        _index = torch.multinomial(input + constants.epsilon, 1, False)

        # One-hot encode the sampled indices into output
        output.scatter_(1, _index, torch.unsqueeze(one.repeat(_index.size()[0]), 1))

        ctx.mark_dirty(input)
        ctx.save_for_backward(input)

        return _index.float()
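For context, the multinomial-plus-scatter_ step above is a standard way to turn sampled indices into a one-hot matrix. A minimal CPU sketch of the same trick (the probability values here are made up for illustration):

```python
import torch

probs = torch.tensor([[0.1, 0.7, 0.2],
                      [0.5, 0.25, 0.25]])

# Sample one column index per row from the given probabilities
idx = torch.multinomial(probs, 1, replacement=False)   # shape (2, 1)

# One-hot encode the sampled indices, as forward() does with scatter_
one_hot = torch.zeros(probs.size())
one_hot.scatter_(1, idx, torch.ones(idx.size(0), 1))

# Each row of one_hot now has a single 1 at the sampled index
```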

Why do you put everything in Variables just to unpack it at the end?
Also, do you declare both forward and backward as @staticmethod, and do you use your Function via the .apply() method?
And which version of PyTorch are you running?

Hi,

The Variables were a legacy from a previous version of the code (as a Module) and have been removed.

Both forward and backward are declared as @staticmethod.

I use the Function by assigning Function.apply to a variable and then calling that variable as a function.
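For reference, that pattern looks roughly like the sketch below (the class here just doubles its input; its name and body are made up, not the code from this thread). Note that the custom backward is only entered when at least one input passed to apply requires grad:

```python
import torch
from torch.autograd import Function

class Double(Function):
    @staticmethod
    def forward(ctx, input):
        ctx.save_for_backward(input)
        return input * 2

    @staticmethod
    def backward(ctx, grad_output):
        # Reached only if some input to apply() required grad
        return grad_output * 2

double = Double.apply                   # assign .apply to a variable ...
x = torch.ones(3, requires_grad=True)   # in 0.3 this would be Variable(..., requires_grad=True)
y = double(x)                           # ... and call that variable as a function
y.sum().backward()                      # x.grad is now filled in by Double.backward
```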

I am using PyTorch 0.3.1.post2.