Legacy autograd function with non-static forward method is deprecated and will be removed in 1.3

The example in this post might help:

It seems that in the new style you remove __init__() entirely and move whatever arguments __init__() took into the forward() method.

Something like this, though I am not entirely sure; I am also looking for a way to migrate some old PyTorch code:

import torch

class GradReverse(torch.autograd.Function):
    @staticmethod
    def forward(ctx, x, lambd):
        # Stash lambd on the context so backward() can read it
        ctx.lambd = lambd
        return x.view_as(x)

    @staticmethod
    def backward(ctx, grad_output):
        # Negate and scale the gradient; the None matches the lambd input
        return grad_output * -ctx.lambd, None
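As a minimal sketch of how the new-style Function is used: you never instantiate it, you call the class's apply() method directly. Assuming the snippet above is meant as a gradient-reversal layer (the class name GradReverse is my guess, not from the original post), a complete runnable version would look like this:

```python
import torch

class GradReverse(torch.autograd.Function):
    """New-style autograd Function: static forward/backward, no __init__."""

    @staticmethod
    def forward(ctx, x, lambd):
        # Stash lambd on the context so backward() can read it
        ctx.lambd = lambd
        return x.view_as(x)

    @staticmethod
    def backward(ctx, grad_output):
        # Negate and scale the gradient; the None matches the lambd input
        return grad_output * -ctx.lambd, None

x = torch.ones(3, requires_grad=True)
y = GradReverse.apply(x, 0.5)  # new style: call apply(), never GradReverse()(x)
y.sum().backward()
print(x.grad)  # each entry is -0.5: the gradient was reversed and scaled
```

The forward pass is an identity (view_as), so the layer only changes behavior during backpropagation, which is the usual pattern for gradient reversal.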