Is it feasible to call an nn.Module's forward inside an autograd.Function's forward?

The scenario is adding hooks around the model's forward and backward passes. Is it feasible to call real_model(x) inside hook_func.forward() to run the real model's forward, as below?

Pseudo code:

import torch

class Model(torch.nn.Module):
    def __init__(self):
        super().__init__()
        # model init
        ...

    def forward(self, x):
        # computation graph that returns x
        return x

real_model = Model()

class hook_func(torch.autograd.Function):
    @staticmethod
    def forward(ctx, x):
        # some operation on x
        print(x)
        # call the real forward of a custom nn.Module
        x = real_model(x)
        return x
    ...
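As a quick sanity check (a minimal sketch, not from the thread, with the model reduced to a toy Linear layer): a custom Function's forward runs with autograd recording disabled and receives detached inputs, so a module called inside it is not tracked, and its gradient would have to be reproduced by hand in backward.

```python
import torch

model = torch.nn.Linear(2, 2)  # stand-in for the custom Model
flags = []

class Wrap(torch.autograd.Function):
    @staticmethod
    def forward(ctx, x):
        # Inside a custom Function's forward, grad mode is disabled and
        # the input arrives detached, so this call is NOT tracked.
        y = model(x)
        flags.append(y.requires_grad)
        return y

    @staticmethod
    def backward(ctx, grad_output):
        # Autograd did not record model(x) above, so the model's true
        # gradient would have to be re-derived by hand here.
        return grad_output

x = torch.randn(1, 2, requires_grad=True)
out = Wrap.apply(x)
print(flags)  # [False]: the inner model call was not tracked by autograd
```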

If your use case is to hook into the module's forward and backward, check out the note on module hooks here: Modules — PyTorch master documentation
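For reference, a minimal sketch of that module-hook approach (the model is reduced to a single Linear layer here for illustration): a forward hook fires after the module's forward runs, and a full backward hook fires once gradients with respect to the module's inputs have been computed.

```python
import torch

model = torch.nn.Linear(4, 4)  # stand-in for the custom Model
events = []

# Fires after the module's forward has run.
model.register_forward_hook(lambda mod, inp, out: events.append("forward"))
# Fires after gradients w.r.t. the module's inputs are computed.
model.register_full_backward_hook(lambda mod, gin, gout: events.append("backward"))

x = torch.randn(2, 4, requires_grad=True)
model(x).sum().backward()
print(events)  # ['forward', 'backward']
```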

Thanks for the information! Unfortunately, a hook registered with register_full_backward_hook() only fires after the module's backward has run, which does not fully satisfy my needs. For now, I have split hook_func into two torch.autograd.Functions and customized the Module accordingly.
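That split might look something like the following sketch (PreHook/PostHook are assumed names, not from the thread): two identity Functions wrap the untouched model call, so autograd still tracks the model normally, while PostHook's backward runs before the model's backward and PreHook's backward runs after it.

```python
import torch

calls = []

class PreHook(torch.autograd.Function):
    # Identity op placed before the model: its forward records before
    # the model's forward, and its backward runs after gradients have
    # flowed back through the model.
    @staticmethod
    def forward(ctx, x):
        calls.append("pre-forward")
        return x.view_as(x)

    @staticmethod
    def backward(ctx, grad_output):
        calls.append("post-backward")
        return grad_output

class PostHook(torch.autograd.Function):
    # Identity op placed after the model: its forward records after
    # the model's forward, and its backward runs before gradients
    # flow back through the model.
    @staticmethod
    def forward(ctx, x):
        calls.append("post-forward")
        return x.view_as(x)

    @staticmethod
    def backward(ctx, grad_output):
        calls.append("pre-backward")
        return grad_output

model = torch.nn.Linear(3, 3)  # stand-in for the custom Model

def hooked_forward(x):
    x = PreHook.apply(x)
    x = model(x)  # autograd tracks the model normally here
    return PostHook.apply(x)

x = torch.randn(2, 3, requires_grad=True)
hooked_forward(x).sum().backward()
print(calls)
```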