Is it possible to use the context of the forward pass in the backward pass?

Thanks in advance.
I want to customize backpropagation with my own logic.
I found the code below, and it works well.

class FeatureExtractor(torch.autograd.Function):
    @staticmethod
    def forward(ctx, input, weight, bias=None, stride=2, padding=3, dilation=1, groups=1):
        # save_for_backward only takes tensors, so pack the conv settings into one
        confs = torch.tensor([stride, padding, dilation, groups])
        ctx.save_for_backward(input, weight, bias, confs)
        ...

    @staticmethod
    def backward(ctx, grad_output):
        input, weight, bias, confs = ctx.saved_tensors  # saved_variables is deprecated
        ...

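For reference, here is a minimal complete sketch of that pattern for a convolution, filled in with the gradient helpers from `torch.nn.grad`. The class name `MyConv2d` and the chosen shapes are my own for illustration; the point is that tensors go through `ctx.save_for_backward` while plain Python values (stride, padding) can be stored directly as attributes on `ctx`:

```python
import torch
from torch.autograd import Function

class MyConv2d(Function):
    @staticmethod
    def forward(ctx, input, weight, bias=None, stride=2, padding=3):
        # Tensors must go through save_for_backward; plain Python
        # values can simply be stashed as attributes on ctx.
        ctx.save_for_backward(input, weight, bias)
        ctx.stride, ctx.padding = stride, padding
        return torch.nn.functional.conv2d(
            input, weight, bias, stride=stride, padding=padding)

    @staticmethod
    def backward(ctx, grad_output):
        input, weight, bias = ctx.saved_tensors
        grad_input = torch.nn.grad.conv2d_input(
            input.shape, weight, grad_output,
            stride=ctx.stride, padding=ctx.padding)
        grad_weight = torch.nn.grad.conv2d_weight(
            input, weight.shape, grad_output,
            stride=ctx.stride, padding=ctx.padding)
        grad_bias = grad_output.sum(dim=(0, 2, 3)) if bias is not None else None
        # One gradient per forward argument; non-tensor args get None.
        return grad_input, grad_weight, grad_bias, None, None

x = torch.randn(1, 3, 32, 32, requires_grad=True)
w = torch.randn(8, 3, 7, 7, requires_grad=True)
y = MyConv2d.apply(x, w, None)
y.sum().backward()
```

Custom gradient logic would go in `backward` before returning, e.g. modifying `grad_input` using the saved forward tensors.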
Now I want to test with a bigger model, like ResNet-18 on ImageNet.
With the approach above, the code gets messy.

Can I use forward/backward hooks for this purpose?
How can I save a tensor in the forward pass and send it to the backward pass?
(Is there something similar to the ctx above?)
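One way this could work (a sketch, not necessarily the only approach): hooks do not get a `ctx`, but a forward hook can stash activations in an ordinary dict keyed by module name, and a backward hook on the same module can read them back. The `saved` dict and hook-factory functions below are my own naming; the same registration loop would apply to the conv layers of a torchvision `resnet18`:

```python
import torch
import torch.nn as nn

saved = {}  # forward activations keyed by module name, read in backward hooks

def make_forward_hook(name):
    def hook(module, inputs, output):
        saved[name] = output.detach()  # stash the forward activation
    return hook

def make_backward_hook(name):
    def hook(module, grad_input, grad_output):
        act = saved[name]  # tensor saved during the forward pass
        # Custom gradient logic would go here; returning None
        # leaves the gradients unchanged.
        return None
    return hook

model = nn.Sequential(nn.Conv2d(3, 8, 3, padding=1), nn.ReLU(),
                      nn.Conv2d(8, 4, 3, padding=1))
for name, m in model.named_modules():
    if isinstance(m, nn.Conv2d):
        m.register_forward_hook(make_forward_hook(name))
        m.register_full_backward_hook(make_backward_hook(name))

x = torch.randn(2, 3, 16, 16, requires_grad=True)
model(x).sum().backward()
```

This keeps the model definition untouched, which avoids the messiness of rewriting every layer as a custom `Function` for a large network.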

I want to use some values from the forward pass during backpropagation through a convolution.
Is it possible?