Using a linear layer inside torch.autograd.Function

import torch as T
import torch.nn as nn

# nn.Linear takes (in_features, out_features)
net = nn.Linear(in_, out_, bias=False)

class Function(T.autograd.Function):

    @staticmethod
    def forward(ctx, input, net):
        output = net(input)
        ctx.save_for_backward(net, input)
        return output

    @staticmethod
    def backward(ctx, grad_output): ...

How can I use a net inside an autograd.Function? Is it possible? What are my options?

What are you trying to achieve?

While it is possible (you'd probably want to assign the net to ctx._net instead of using save_for_backward, since save_for_backward only accepts tensors), it certainly is not advisable to do so.
A more natural way would be to pass the net.weight as an argument to the forward (and, if you don't want to differentiate with respect to it, return None as its gradient from backward).
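To illustrate, here is a minimal sketch of that approach: the weight tensor (not the module) is passed into forward and saved with save_for_backward, and backward returns a gradient for each input. The class name LinearFn and the shapes are my own choices, not from the original post.

```python
import torch

class LinearFn(torch.autograd.Function):
    @staticmethod
    def forward(ctx, input, weight):
        # Only tensors go into save_for_backward.
        ctx.save_for_backward(input, weight)
        return input @ weight.t()

    @staticmethod
    def backward(ctx, grad_output):
        input, weight = ctx.saved_tensors
        grad_input = grad_output @ weight          # d(out)/d(input)
        grad_weight = grad_output.t() @ input      # d(out)/d(weight)
        # Return None instead of grad_weight if you don't want to
        # differentiate with respect to the weight.
        return grad_input, grad_weight

# Usage: hand the module's weight to the Function.
net = torch.nn.Linear(3, 2, bias=False)
x = torch.randn(4, 3, requires_grad=True)
out = LinearFn.apply(x, net.weight)
out.sum().backward()
```

torch.autograd.gradcheck can be used (with double-precision inputs) to verify that the hand-written backward matches numerical gradients.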

Best regards

Thomas