Access a variable from torch.autograd.function.GradReverseBackward

I have defined a function as follows:

class GradReverse(torch.autograd.Function):
    def __init__(self, lambd):
        self.lambd = lambd

    @staticmethod
    def forward(self, x):
        return x.view_as(x)

    @staticmethod
    def backward(self, grad_output):
        return (grad_output * -self.lambd)

But during backpropagation I get "AttributeError: 'GradReverseBackward' object has no attribute 'lambd'", even though I set it when constructing the function as GradReverse(lambd).apply(x). Is there something I am missing?

The forward and backward methods are defined as staticmethods and therefore receive ctx as their first argument, while you are trying to access the self attribute of an instance, which is never used when the function is called via apply.
Pass lambd into forward and store it on ctx (assign non-tensor values directly as an attribute; ctx.save_for_backward() is reserved for tensors) so you can read it in backward. The tutorial uses the same approach to store the input tensor, which is needed for the gradient computation.
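
For example, here is a minimal sketch of the corrected function, assuming lambd is a plain Python float (the example values are illustrative):

import torch

class GradReverse(torch.autograd.Function):
    @staticmethod
    def forward(ctx, x, lambd):
        ctx.lambd = lambd  # non-tensor state can be stored directly on ctx
        return x.view_as(x)

    @staticmethod
    def backward(ctx, grad_output):
        # backward must return one gradient per forward input;
        # returning None for lambd tells autograd no gradient flows to it
        return grad_output * -ctx.lambd, None

x = torch.randn(3, requires_grad=True)
GradReverse.apply(x, 0.5).sum().backward()
print(x.grad)  # every entry is -0.5

Note that apply is called on the class itself, not on an instance, which is why the __init__ in your snippet never takes effect.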

Thank you @ptrblck! That worked!